[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
18823 1726855007.93006: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-ZzD
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
18823 1726855007.93474: Added group all to inventory
18823 1726855007.93476: Added group ungrouped to inventory
18823 1726855007.93480: Group all now contains ungrouped
18823 1726855007.93483: Examining possible inventory source: /tmp/network-Koj/inventory.yml
18823 1726855008.08386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
18823 1726855008.08451: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
18823 1726855008.08474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
18823 1726855008.08538: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
18823 1726855008.08614: Loaded config def from plugin (inventory/script)
18823 1726855008.08616: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
18823 1726855008.08656: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
18823 1726855008.08746: Loaded config def from plugin (inventory/yaml)
18823 1726855008.08748: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
18823 1726855008.08837: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
18823 1726855008.09277: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
18823 1726855008.09280: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
18823 1726855008.09283: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
18823 1726855008.09290: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
18823 1726855008.09297: Loading data from /tmp/network-Koj/inventory.yml
18823 1726855008.09363: /tmp/network-Koj/inventory.yml was not parsable by auto
18823 1726855008.09434: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
18823 1726855008.09482: Loading data from /tmp/network-Koj/inventory.yml
18823 1726855008.09568: group all already in inventory
18823 1726855008.09575: set inventory_file for managed_node1
18823 1726855008.09580: set inventory_dir for managed_node1
18823 1726855008.09581: Added host managed_node1 to inventory
18823 1726855008.09583: Added host managed_node1 to group all
18823 1726855008.09584: set ansible_host for managed_node1
18823 1726855008.09584: set ansible_ssh_extra_args for managed_node1
18823 1726855008.09589: set inventory_file for managed_node2
18823 1726855008.09592: set inventory_dir for managed_node2
18823 1726855008.09593: Added host managed_node2 to inventory
18823 1726855008.09597: Added host managed_node2 to group all
18823 1726855008.09598: set ansible_host for managed_node2
18823 1726855008.09599: set ansible_ssh_extra_args for managed_node2
18823 1726855008.09602: set inventory_file for managed_node3
18823 1726855008.09605: set inventory_dir for managed_node3
18823 1726855008.09606: Added host managed_node3 to inventory
18823 1726855008.09607: Added host managed_node3 to group all
18823 1726855008.09608: set ansible_host for managed_node3
18823 1726855008.09609: set ansible_ssh_extra_args for managed_node3
18823 1726855008.09611: Reconcile groups and hosts in inventory.
18823 1726855008.09615: Group ungrouped now contains managed_node1
18823 1726855008.09617: Group ungrouped now contains managed_node2
18823 1726855008.09618: Group ungrouped now contains managed_node3
18823 1726855008.09696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
18823 1726855008.09792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
18823 1726855008.09849: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
18823 1726855008.09877: Loaded config def from plugin (vars/host_group_vars)
18823 1726855008.09879: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
18823 1726855008.09886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
18823 1726855008.09896: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
18823 1726855008.09949: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
18823 1726855008.10325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855008.10435: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
18823 1726855008.10474: Loaded config def from plugin (connection/local)
18823 1726855008.10477: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
18823 1726855008.11256: Loaded config def from plugin (connection/paramiko_ssh)
18823 1726855008.11300: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
18823 1726855008.13324: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18823 1726855008.13365: Loaded config def from plugin (connection/psrp)
18823 1726855008.13368: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
18823 1726855008.14170: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18823 1726855008.14214: Loaded config def from plugin (connection/ssh)
18823 1726855008.14218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
18823 1726855008.16391: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18823 1726855008.16432: Loaded config def from plugin (connection/winrm)
18823 1726855008.16435: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
18823 1726855008.16474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
18823 1726855008.16539: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
18823 1726855008.16658: Loaded config def from plugin (shell/cmd)
18823 1726855008.16660: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
18823 1726855008.16705: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
18823 1726855008.16772: Loaded config def from plugin (shell/powershell)
18823 1726855008.16774: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
18823 1726855008.16836: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
18823 1726855008.17046: Loaded config def from plugin (shell/sh)
18823 1726855008.17048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
18823 1726855008.17082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
18823 1726855008.17213: Loaded config def from plugin (become/runas)
18823 1726855008.17216: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
18823 1726855008.17416: Loaded config def from plugin (become/su)
18823 1726855008.17419: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
18823 1726855008.17584: Loaded config def from plugin (become/sudo)
18823 1726855008.17586: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
18823 1726855008.17620: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18823 1726855008.17969: in VariableManager get_vars()
18823 1726855008.17993: done with get_vars()
18823 1726855008.18136: trying /usr/local/lib/python3.12/site-packages/ansible/modules
18823 1726855008.21285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
18823 1726855008.21410: in VariableManager get_vars()
18823 1726855008.21416: done with get_vars()
18823 1726855008.21419: variable 'playbook_dir' from source: magic vars
18823 1726855008.21420: variable 'ansible_playbook_python' from source: magic vars
18823 1726855008.21421: variable 'ansible_config_file' from source: magic vars
18823 1726855008.21421: variable 'groups' from source: magic vars
18823 1726855008.21422: variable 'omit' from source: magic vars
18823 1726855008.21423: variable 'ansible_version' from source: magic vars
18823 1726855008.21424: variable 'ansible_check_mode' from source: magic vars
18823 1726855008.21424: variable 'ansible_diff_mode' from source: magic vars
18823 1726855008.21425: variable 'ansible_forks' from source: magic vars
18823 1726855008.21426: variable 'ansible_inventory_sources' from source: magic vars
18823 1726855008.21426: variable 'ansible_skip_tags' from source: magic vars
18823 1726855008.21427: variable 'ansible_limit' from source: magic vars
18823 1726855008.21428: variable 'ansible_run_tags' from source: magic vars
18823 1726855008.21428: variable 'ansible_verbosity' from source: magic vars
18823 1726855008.21465: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml
18823 1726855008.22177: in VariableManager get_vars()
18823 1726855008.22195: done with get_vars()
18823 1726855008.22233: in VariableManager get_vars()
18823 1726855008.22262: done with get_vars()
18823 1726855008.22302: in VariableManager get_vars()
18823 1726855008.22314: done with get_vars()
18823 1726855008.22344: in VariableManager get_vars()
18823 1726855008.22363: done with get_vars()
18823 1726855008.22436: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18823 1726855008.22651: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18823 1726855008.22808: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18823 1726855008.23497: in VariableManager get_vars()
18823 1726855008.23517: done with get_vars()
18823 1726855008.23960: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
18823 1726855008.24110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18823 1726855008.25451: in VariableManager get_vars()
18823 1726855008.25469: done with get_vars()
18823 1726855008.25611: in VariableManager get_vars()
18823 1726855008.25615: done with get_vars()
18823 1726855008.25618: variable 'playbook_dir' from source: magic vars
18823 1726855008.25619: variable 'ansible_playbook_python' from source: magic vars
18823 1726855008.25619: variable 'ansible_config_file' from source: magic vars
18823 1726855008.25620: variable 'groups' from source: magic vars
18823 1726855008.25621: variable 'omit' from source: magic vars
18823 1726855008.25621: variable 'ansible_version' from source: magic vars
18823 1726855008.25622: variable 'ansible_check_mode' from source: magic vars
18823 1726855008.25623: variable 'ansible_diff_mode' from source: magic vars
18823 1726855008.25624: variable 'ansible_forks' from source: magic vars
18823 1726855008.25624: variable 'ansible_inventory_sources' from source: magic vars
18823 1726855008.25625: variable 'ansible_skip_tags' from source: magic vars
18823 1726855008.25626: variable 'ansible_limit' from source: magic vars
18823 1726855008.25626: variable 'ansible_run_tags' from source: magic vars
18823 1726855008.25627: variable 'ansible_verbosity' from source: magic vars
18823 1726855008.25659: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
18823 1726855008.25741: in VariableManager get_vars()
18823 1726855008.25744: done with get_vars()
18823 1726855008.25746: variable 'playbook_dir' from source: magic vars
18823 1726855008.25747: variable 'ansible_playbook_python' from source: magic vars
18823 1726855008.25748: variable 'ansible_config_file' from source: magic vars
18823 1726855008.25749: variable 'groups' from source: magic vars
18823 1726855008.25750: variable 'omit' from source: magic vars
18823 1726855008.25751: variable 'ansible_version' from source: magic vars
18823 1726855008.25751: variable 'ansible_check_mode' from source: magic vars
18823 1726855008.25752: variable 'ansible_diff_mode' from source: magic vars
18823 1726855008.25753: variable 'ansible_forks' from source: magic vars
18823 1726855008.25753: variable 'ansible_inventory_sources' from source: magic vars
18823 1726855008.25754: variable 'ansible_skip_tags' from source: magic vars
18823 1726855008.25755: variable 'ansible_limit' from source: magic vars
18823 1726855008.25756: variable 'ansible_run_tags' from source: magic vars
18823 1726855008.25756: variable 'ansible_verbosity' from source: magic vars
18823 1726855008.25791: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
18823 1726855008.25879: in VariableManager get_vars()
18823 1726855008.25894: done with get_vars()
18823 1726855008.25943: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18823 1726855008.26063: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18823 1726855008.26147: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18823 1726855008.26655: in VariableManager get_vars()
18823 1726855008.26672: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18823 1726855008.28212: in VariableManager get_vars()
18823 1726855008.28239: done with get_vars()
18823 1726855008.28275: in VariableManager get_vars()
18823 1726855008.28278: done with get_vars()
18823 1726855008.28280: variable 'playbook_dir' from source: magic vars
18823 1726855008.28281: variable 'ansible_playbook_python' from source: magic vars
18823 1726855008.28283: variable 'ansible_config_file' from source: magic vars
18823 1726855008.28284: variable 'groups' from source: magic vars
18823 1726855008.28284: variable 'omit' from source: magic vars
18823 1726855008.28285: variable 'ansible_version' from source: magic vars
18823 1726855008.28286: variable 'ansible_check_mode' from source: magic vars
18823 1726855008.28287: variable 'ansible_diff_mode' from source: magic vars
18823 1726855008.28289: variable 'ansible_forks' from source: magic vars
18823 1726855008.28290: variable 'ansible_inventory_sources' from source: magic vars
18823 1726855008.28290: variable 'ansible_skip_tags' from source: magic vars
18823 1726855008.28291: variable 'ansible_limit' from source: magic vars
18823 1726855008.28292: variable 'ansible_run_tags' from source: magic vars
18823 1726855008.28292: variable 'ansible_verbosity' from source: magic vars
18823 1726855008.28324: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
18823 1726855008.28403: in VariableManager get_vars()
18823 1726855008.28415: done with get_vars()
18823 1726855008.28461: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18823 1726855008.30203: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18823 1726855008.30281: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18823 1726855008.30701: in VariableManager get_vars()
18823 1726855008.30721: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18823 1726855008.32363: in VariableManager get_vars()
18823 1726855008.32386: done with get_vars()
18823 1726855008.32422: in VariableManager get_vars()
18823 1726855008.32433: done with get_vars()
18823 1726855008.32500: in VariableManager get_vars()
18823 1726855008.32513: done with get_vars()
18823 1726855008.32611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
18823 1726855008.32625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
18823 1726855008.32860: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
18823 1726855008.33034: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
18823 1726855008.33037: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
18823 1726855008.33067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
18823 1726855008.33097: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
18823 1726855008.33279: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
18823 1726855008.33341: Loaded config def from plugin (callback/default)
18823 1726855008.33344: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
18823 1726855008.34545: Loaded config def from plugin (callback/junit)
18823 1726855008.34548: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
18823 1726855008.34600: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
18823 1726855008.34673: Loaded config def from plugin (callback/minimal)
18823 1726855008.34675: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
18823 1726855008.34718: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
18823 1726855008.34786: Loaded config def from plugin (callback/tree)
18823 1726855008.34790: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
18823 1726855008.34918: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
18823 1726855008.34921: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18823 1726855008.34945: in VariableManager get_vars()
18823 1726855008.34958: done with get_vars()
18823 1726855008.34963: in VariableManager get_vars()
18823 1726855008.34971: done with get_vars()
18823 1726855008.34975: variable 'omit' from source: magic vars
18823 1726855008.35019: in VariableManager get_vars()
18823 1726855008.35036: done with get_vars()
18823 1726855008.35062: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
18823 1726855008.35634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
18823 1726855008.35715: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
18823 1726855008.35745: getting the remaining hosts for this loop
18823 1726855008.35747: done getting the remaining hosts for this loop
18823 1726855008.35749: getting the next task for host managed_node2
18823 1726855008.35760: done getting next task for host managed_node2
18823 1726855008.35762: ^ task is: TASK: Gathering Facts
18823 1726855008.35764: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855008.35771: getting variables
18823 1726855008.35772: in VariableManager get_vars()
18823 1726855008.35782: Calling all_inventory to load vars for managed_node2
18823 1726855008.35784: Calling groups_inventory to load vars for managed_node2
18823 1726855008.35789: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855008.35801: Calling all_plugins_play to load vars for managed_node2
18823 1726855008.35812: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855008.35815: Calling groups_plugins_play to load vars for managed_node2
18823 1726855008.35848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855008.35913: done with get_vars()
18823 1726855008.35920: done getting variables
18823 1726855008.35990: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 13:56:48 -0400 (0:00:00.011) 0:00:00.011 ******
18823 1726855008.36011: entering _queue_task() for managed_node2/gather_facts
18823 1726855008.36012: Creating lock for gather_facts
18823 1726855008.36599: worker is 1 (out of 1 available)
18823 1726855008.36606: exiting _queue_task() for managed_node2/gather_facts
18823 1726855008.36617: done queuing things up, now waiting for results queue to drain
18823 1726855008.36618: waiting for pending results...
18823 1726855008.36714: running TaskExecutor() for managed_node2/TASK: Gathering Facts
18823 1726855008.36805: in run() - task 0affcc66-ac2b-d391-077c-00000000007c
18823 1726855008.36845: variable 'ansible_search_path' from source: unknown
18823 1726855008.36901: calling self._execute()
18823 1726855008.36972: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855008.37033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855008.37037: variable 'omit' from source: magic vars
18823 1726855008.37109: variable 'omit' from source: magic vars
18823 1726855008.37149: variable 'omit' from source: magic vars
18823 1726855008.37196: variable 'omit' from source: magic vars
18823 1726855008.37253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18823 1726855008.37300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18823 1726855008.37327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18823 1726855008.37390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18823 1726855008.37393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18823 1726855008.37410: variable 'inventory_hostname' from source: host vars for 'managed_node2'
18823 1726855008.37418: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855008.37426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855008.37541: Set connection var ansible_timeout to 10
18823 1726855008.37553: Set connection var ansible_module_compression to ZIP_DEFLATED
18823 1726855008.37578: Set connection var ansible_shell_type to sh
18823 1726855008.37581: Set connection var ansible_shell_executable to /bin/sh
18823 1726855008.37583: Set connection var ansible_connection to ssh
18823 1726855008.37602: Set connection var ansible_pipelining to False
18823 1726855008.37689: variable 'ansible_shell_executable' from source: unknown
18823 1726855008.37692: variable 'ansible_connection' from source: unknown
18823 1726855008.37694: variable 'ansible_module_compression' from source: unknown
18823 1726855008.37696: variable 'ansible_shell_type' from source: unknown
18823 1726855008.37699: variable 'ansible_shell_executable' from source: unknown
18823 1726855008.37701: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855008.37703: variable 'ansible_pipelining' from source: unknown
18823 1726855008.37711: variable 'ansible_timeout' from source: unknown
18823 1726855008.37713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855008.37865: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18823 1726855008.37879: variable 'omit' from source: magic vars
18823 1726855008.37891: starting attempt loop
18823 1726855008.37905: running the handler
18823 1726855008.37930: variable 'ansible_facts' from source: unknown
18823 1726855008.37951: _low_level_execute_command(): starting
18823 1726855008.38014: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18823 1726855008.38808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855008.38890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855008.38905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855008.38922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855008.39028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855008.41117: stdout chunk (state=3): >>>/root <<<
18823 1726855008.41121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855008.41124: stdout chunk (state=3): >>><<<
18823 1726855008.41126: stderr chunk (state=3): >>><<<
18823 1726855008.41147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18823 1726855008.41166: _low_level_execute_command(): starting
18823 1726855008.41295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180 `" && echo ansible-tmp-1726855008.4115393-18859-101980585277180="` echo /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180 `" ) && sleep 0'
18823 1726855008.42259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18823 1726855008.42274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855008.42291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
18823 1726855008.42310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855008.42369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855008.42432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855008.42485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855008.42559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855008.44494: stdout chunk (state=3): >>>ansible-tmp-1726855008.4115393-18859-101980585277180=/root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180 <<<
18823 1726855008.44649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855008.44652: stdout chunk (state=3): >>><<<
18823 1726855008.44655: stderr chunk (state=3): >>><<<
18823 1726855008.44670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855008.4115393-18859-101980585277180=/root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855008.44992: variable 'ansible_module_compression' from source: unknown 18823 1726855008.44996: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18823 1726855008.44999: ANSIBALLZ: Acquiring lock 18823 1726855008.45002: ANSIBALLZ: Lock acquired: 140142269228544 18823 1726855008.45004: ANSIBALLZ: Creating module 18823 1726855009.07006: ANSIBALLZ: Writing module into payload 18823 1726855009.07177: ANSIBALLZ: Writing module 18823 1726855009.07208: ANSIBALLZ: Renaming module 18823 1726855009.07219: ANSIBALLZ: Done creating module 18823 1726855009.07255: variable 'ansible_facts' from source: unknown 18823 1726855009.07266: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855009.07277: _low_level_execute_command(): starting 18823 1726855009.07289: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18823 1726855009.08013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
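The `( umask 77 && mkdir -p ... && mkdir ansible-tmp-... && echo name=path )` command above is Ansible's remote tmp-dir step: the `umask 77` guarantees the directories come out mode 0700, and the final `echo` hands the controller the name-to-path mapping it parses out of stdout. A minimal local reproduction, assuming an illustrative base path instead of the controller's `/root/.ansible/tmp`:

```python
import os
import random
import time

def make_remote_tmpdir(base="/tmp/ansible-demo"):  # base path is illustrative
    """Mimic Ansible's remote tmp-dir creation step."""
    old_umask = os.umask(0o077)  # equivalent of `umask 77` in the sh snippet
    try:
        name = f"ansible-tmp-{time.time()}-{random.randint(0, 2**48)}"
        path = os.path.join(base, name)
        os.makedirs(path)        # `mkdir -p base && mkdir path`
        return f"{name}={path}"  # the `echo name=path` line Ansible parses
    finally:
        os.umask(old_umask)
```

The `&& sleep 0` suffix in the real command exists only to normalize shell exit behavior; it is not reproduced here.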
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855009.08060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855009.08092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855009.08098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855009.08349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855009.10104: stdout chunk (state=3): >>>PLATFORM <<< 18823 1726855009.10475: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18823 1726855009.10478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855009.10481: stdout chunk (state=3): >>><<< 18823 1726855009.10483: stderr chunk (state=3): >>><<< 18823 1726855009.10485: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855009.10492 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 18823 1726855009.10572: _low_level_execute_command(): starting 18823 1726855009.10583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 18823 1726855009.10941: Sending initial data 18823 1726855009.10945: Sent initial data (1181 bytes) 18823 1726855009.12111: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
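The `PLATFORM`/`FOUND`/`ENDFOUND` markers in the probe output above delimit the results of the chained `command -v` lookups; interpreters that do not resolve simply print nothing, which is why only three paths came back and why the log then reports `found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']`. A minimal sketch of that parsing step (the function name is ours, not Ansible's):

```python
def parse_interpreter_probe(stdout: str):
    """Split a discovery probe's stdout into (platform, interpreter paths).

    The probe prints: PLATFORM / uname output / FOUND / one path per
    resolved `command -v` / ENDFOUND.
    """
    lines = stdout.strip().splitlines()
    platform = lines[lines.index("PLATFORM") + 1]
    start = lines.index("FOUND") + 1
    end = lines.index("ENDFOUND")
    return platform, lines[start:end]
```

Note the duplicate `/usr/bin/python3`: both the explicit `/usr/bin/python3` probe and the bare `python3` probe resolved to the same binary, and discovery keeps them in probe order, preferring the earliest (here `python3.12`).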
/etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855009.12161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855009.12181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855009.12214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855009.12323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855009.15918: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18823 1726855009.16237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855009.16281: stderr chunk (state=3): >>><<< 18823 1726855009.16285: stdout chunk (state=3): >>><<< 18823 1726855009.16289: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855009.16603: variable 'ansible_facts' from source: unknown 18823 1726855009.16606: variable 'ansible_facts' from source: unknown 18823 1726855009.16608: variable 'ansible_module_compression' from source: unknown 18823 1726855009.16610: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855009.16612: variable 'ansible_facts' from source: unknown 18823 1726855009.16992: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py 18823 1726855009.17446: Sending initial data 18823 1726855009.17450: Sent initial data (154 bytes) 18823 1726855009.18896: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
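The platform probe above returns a single JSON object on stdout whose `osrelease_content` field is the raw text of the target's `/etc/os-release`; the controller parses those `KEY="value"` lines to classify the host (here CentOS Stream 10). A hedged sketch of that parse, using a trimmed-down version of the content from the log:

```python
import json

def parse_os_release(content: str) -> dict:
    """Parse /etc/os-release KEY="value" lines into a dict (quotes stripped)."""
    facts = {}
    for line in content.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            facts[key] = value.strip('"')
    return facts

# The probe result arrives as JSON on stdout, as in the log above:
probe_result = json.loads(
    '{"platform_dist_result": [], '
    '"osrelease_content": "NAME=\\"CentOS Stream\\"\\nID=\\"centos\\"\\nVERSION_ID=\\"10\\"\\n"}'
)
```

The empty `platform_dist_result` list is expected on modern distributions, where the legacy `platform.dist()` data is unavailable and `/etc/os-release` is authoritative.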
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855009.19029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855009.19214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855009.19218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855009.19407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855009.19447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855009.21065: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18823 1726855009.21072: stderr chunk (state=3): >>>debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855009.21317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py" <<< 18823 1726855009.21320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpv8rfah6d /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py <<< 18823 1726855009.21323: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18823 1726855009.21325: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpv8rfah6d" to remote "/root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py" <<< 18823 1726855009.24309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855009.24374: stderr chunk (state=3): >>><<< 18823 1726855009.24692: stdout chunk (state=3): >>><<< 18823 1726855009.24695: done transferring module to remote 18823 1726855009.24697: _low_level_execute_command(): starting 18823 1726855009.24699: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/ /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py && sleep 0' 18823 1726855009.25948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855009.26081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855009.26161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855009.26301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855009.26485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855009.28372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855009.28375: stdout chunk (state=3): >>><<< 18823 1726855009.28378: stderr chunk (state=3): >>><<< 18823 1726855009.28397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855009.28406: _low_level_execute_command(): starting 18823 1726855009.28531: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/AnsiballZ_setup.py && sleep 0' 18823 1726855009.30346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855009.30362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855009.30379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855009.30594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
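Before executing the payload, Ansible runs `chmod u+x` on both the tmp directory and `AnsiballZ_setup.py`, then invokes the module with `PYTHONVERBOSE=1`, which is why the verbose per-module import trace follows. A hedged local reproduction of those two `_low_level_execute_command()` calls, substituting a throwaway script for the real payload:

```python
import os
import subprocess
import sys
import tempfile

# Stand-in for AnsiballZ_setup.py -- the real payload is the zipped module.
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
script = os.path.join(tmpdir, "AnsiballZ_demo.py")
with open(script, "w") as f:
    f.write('print("module ran")\n')

# Step 1: `chmod u+x <tmpdir> <script>`, as in the log's chmod command.
subprocess.run(["chmod", "u+x", tmpdir, script], check=True)

# Step 2: run it with PYTHONVERBOSE=1 (equivalent to `python -v`), which
# emits an `import ...` trace line for every module loaded.
result = subprocess.run(
    [sys.executable, script],
    env={**os.environ, "PYTHONVERBOSE": "1"},
    capture_output=True, text=True, check=True,
)
```

`PYTHONVERBOSE` is set here only because this is a debug run; in a normal run the module executes silently and only its JSON result comes back on stdout.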
'/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855009.30608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855009.30732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855009.32958: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 18823 1726855009.33056: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 18823 1726855009.33152: stdout chunk (state=3): >>>import 'posix' # <<< 18823 1726855009.33173: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 18823 1726855009.33193: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.33205: stdout chunk (state=3): >>>import '_codecs' # <<< 18823 1726855009.33228: stdout chunk (state=3): >>>import 'codecs' # <<< 18823 1726855009.33265: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18823 1726855009.33303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b7684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b737b30> <<< 18823 1726855009.33371: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 18823 1726855009.33400: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 18823 1726855009.33403: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b76aa50> <<< 18823 1726855009.33405: stdout chunk (state=3): >>>import '_signal' # <<< 18823 1726855009.33407: stdout chunk (state=3): >>>import '_abc' # <<< 18823 1726855009.33409: stdout chunk (state=3): >>>import 'abc' # <<< 18823 1726855009.33420: stdout chunk (state=3): >>>import 'io' # <<< 18823 1726855009.33450: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 18823 1726855009.33536: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18823 1726855009.33564: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 18823 1726855009.33708: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 18823 1726855009.33711: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 18823 1726855009.33713: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 18823 1726855009.33715: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 18823 1726855009.33717: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 18823 1726855009.33718: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b53d130> <<< 18823 1726855009.33770: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 18823 1726855009.33781: stdout chunk (state=3): >>># code object 
from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b53dfa0> <<< 18823 1726855009.33834: stdout chunk (state=3): >>>import 'site' # <<< 18823 1726855009.33837: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 18823 1726855009.34224: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18823 1726855009.34229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18823 1726855009.34263: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.34279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18823 1726855009.34313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18823 1726855009.34324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18823 1726855009.34368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18823 1726855009.34375: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b57be00> <<< 18823 1726855009.34389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18823 1726855009.34410: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 18823 1726855009.34436: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b57bec0> <<< 18823 1726855009.34454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18823 1726855009.34479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18823 1726855009.34504: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18823 1726855009.34567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.34574: stdout chunk (state=3): >>>import 'itertools' # <<< 18823 1726855009.34626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5b37d0> <<< 18823 1726855009.34642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5b3e60> import '_collections' # <<< 18823 1726855009.34707: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b593ad0> <<< 18823 1726855009.34732: stdout chunk (state=3): >>>import '_functools' # <<< 18823 1726855009.34745: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5911f0> <<< 18823 
1726855009.34838: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b578fb0> <<< 18823 1726855009.34889: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18823 1726855009.34892: stdout chunk (state=3): >>>import '_sre' # <<< 18823 1726855009.34916: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 18823 1726855009.34955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18823 1726855009.34993: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5d3770> <<< 18823 1726855009.35026: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5d2390> <<< 18823 1726855009.35040: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b592090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5d0bc0> <<< 18823 1726855009.35095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18823 1726855009.35127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f240b608800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b578230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18823 1726855009.35182: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.35208: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b608cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b608b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.35254: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b608ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b576d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 18823 1726855009.35295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 18823 1726855009.35315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b609580> import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f240b609250> import 'importlib.machinery' # <<< 18823 1726855009.35366: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 18823 1726855009.35396: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b60a480> import 'importlib.util' # <<< 18823 1726855009.35425: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18823 1726855009.35445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18823 1726855009.35504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b6206b0> <<< 18823 1726855009.35553: stdout chunk (state=3): >>>import 'errno' # <<< 18823 1726855009.35557: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b621d90> <<< 18823 1726855009.35590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 18823 1726855009.35613: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b622c30> <<< 18823 1726855009.35670: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b623290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b622180> <<< 18823 1726855009.35673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 18823 1726855009.35845: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b623d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b623440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b60a4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18823 1726855009.35896: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' 
executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b323bc0> <<< 18823 1726855009.35943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34c410> <<< 18823 1726855009.35980: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34c6e0> <<< 18823 1726855009.36007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18823 1726855009.36072: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.36195: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34d010> <<< 18823 1726855009.36314: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.36459: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b321d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18823 1726855009.36479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b60abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18823 1726855009.36540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.36578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18823 1726855009.36608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b37b140> <<< 18823 1726855009.36660: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18823 1726855009.36711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.36718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18823 1726855009.36762: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b39b500> <<< 18823 1726855009.36777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18823 1726855009.36818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18823 1726855009.36864: stdout chunk (state=3): >>>import 'ntpath' # <<< 18823 1726855009.36908: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3fc260> <<< 18823 1726855009.36933: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18823 1726855009.36942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18823 1726855009.36962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18823 1726855009.37010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18823 
1726855009.37092: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3fe9c0> <<< 18823 1726855009.37165: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3fc380> <<< 18823 1726855009.37210: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3c5280> <<< 18823 1726855009.37253: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad29370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b39a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34fd40> <<< 18823 1726855009.37432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18823 1726855009.37454: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f240b39a420> <<< 18823 1726855009.37737: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9kw_mjv2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 18823 1726855009.37863: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.37908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18823 1726855009.37946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18823 1726855009.38049: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18823 1726855009.38053: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad8eff0> <<< 18823 1726855009.38070: stdout chunk (state=3): >>>import '_typing' # <<< 18823 1726855009.38259: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad6dee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad6d040> # zipimport: zlib available <<< 18823 1726855009.38295: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 18823 1726855009.38325: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.38348: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 18823 1726855009.40367: stdout chunk (state=3): >>># zipimport: zlib available<<< 18823 1726855009.40375: stdout chunk (state=3): >>> <<< 18823 1726855009.41914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad8cec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.41918: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18823 1726855009.41923: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 18823 1726855009.41956: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.41978: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240adc2990> <<< 18823 1726855009.41997: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc2720> <<< 18823 1726855009.42203: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc2030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad8fc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240adc3710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f240adc3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18823 1726855009.42251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 18823 1726855009.42313: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc3e90> <<< 18823 1726855009.42379: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18823 1726855009.42394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18823 1726855009.42439: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac29b50><<< 18823 1726855009.42471: stdout chunk (state=3): >>> <<< 18823 1726855009.42476: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855009.42517: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac2b800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 18823 1726855009.42603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2c200> <<< 18823 1726855009.42638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18823 1726855009.42685: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 18823 1726855009.42726: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2d100><<< 18823 1726855009.42744: stdout chunk (state=3): >>> <<< 18823 1726855009.42774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 18823 1726855009.42845: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 18823 1726855009.42890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 18823 1726855009.42894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 18823 1726855009.42960: stdout chunk (state=3): >>> <<< 18823 1726855009.42974: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2fe00><<< 18823 1726855009.42977: stdout chunk (state=3): >>> <<< 18823 1726855009.43042: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855009.43045: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855009.43070: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b37b0b0> <<< 18823 1726855009.43113: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2e0c0><<< 18823 1726855009.43144: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 18823 1726855009.43154: stdout chunk (state=3): >>> 
<<< 18823 1726855009.43222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 18823 1726855009.43225: stdout chunk (state=3): >>> <<< 18823 1726855009.43263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py<<< 18823 1726855009.43268: stdout chunk (state=3): >>> <<< 18823 1726855009.43452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 18823 1726855009.43459: stdout chunk (state=3): >>> <<< 18823 1726855009.43472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 18823 1726855009.43497: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac37d40><<< 18823 1726855009.43514: stdout chunk (state=3): >>> import '_tokenize' # <<< 18823 1726855009.43558: stdout chunk (state=3): >>> <<< 18823 1726855009.43630: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac36810><<< 18823 1726855009.43642: stdout chunk (state=3): >>> <<< 18823 1726855009.43645: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac36570><<< 18823 1726855009.43683: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 18823 1726855009.43797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18823 1726855009.43863: stdout chunk (state=3): >>>import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f240ac36ae0><<< 18823 1726855009.43866: stdout chunk (state=3): >>> <<< 18823 1726855009.43903: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2e5d0> <<< 18823 1726855009.43930: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855009.44003: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac7ba10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 18823 1726855009.44006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 18823 1726855009.44052: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18823 1726855009.44076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 18823 1726855009.44143: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 18823 1726855009.44158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18823 1726855009.44201: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.44240: stdout chunk (state=3): >>># extension module 
'_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac7dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18823 1726855009.44344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.44385: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.44404: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac801a0> <<< 18823 1726855009.44435: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18823 1726855009.44517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 18823 1726855009.44539: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 18823 1726855009.44584: stdout chunk (state=3): >>> <<< 18823 1726855009.44640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18823 1726855009.44657: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac83950> <<< 18823 1726855009.44851: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac80350> 
<<< 18823 1726855009.44945: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.44961: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855009.45023: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac847d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855009.45052: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac849b0><<< 18823 1726855009.45122: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.45182: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac84cb0> <<< 18823 1726855009.45205: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18823 1726855009.45238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 
18823 1726855009.45279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18823 1726855009.45283: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.45313: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab10380> <<< 18823 1726855009.45516: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.45567: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac86b10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac87e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac86720> # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.45574: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18823 1726855009.45650: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.45741: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.45767: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.common' # <<< 18823 1726855009.45797: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 18823 1726855009.45808: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.45918: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.46034: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.46570: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.47420: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab15970> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab16780> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab118e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 18823 1726855009.47428: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.47563: stdout chunk (state=3): >>># zipimport: zlib available <<< 
18823 1726855009.47714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 18823 1726855009.47732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab16930> <<< 18823 1726855009.47805: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.48191: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.48635: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.48732: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.48778: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18823 1726855009.48844: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.48863: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18823 1726855009.48876: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.48943: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.49185: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 18823 1726855009.49382: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.49627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18823 1726855009.49701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 18823 1726855009.49745: stdout chunk (state=3): >>>import 'ast' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f240ab17b60> <<< 18823 1726855009.49928: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 18823 1726855009.49933: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 18823 1726855009.49945: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.49980: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.50019: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 18823 1726855009.50152: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.50177: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.50236: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18823 1726855009.50509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab22330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab1dd90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.50557: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 
1726855009.50590: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.50624: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.50699: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18823 1726855009.50746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18823 1726855009.50807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18823 1726855009.50833: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac0ab70> <<< 18823 1726855009.50913: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240acfe840> <<< 18823 1726855009.50954: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab22030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab154f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 18823 1726855009.51034: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.51037: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # 
<<< 18823 1726855009.51132: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 18823 1726855009.51137: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51190: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51251: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51340: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.51368: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51452: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 18823 1726855009.51467: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51521: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51596: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51611: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.51669: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 18823 1726855009.51830: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.52137: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.52140: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 18823 1726855009.52143: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 18823 1726855009.52154: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 18823 1726855009.52176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 18823 1726855009.52198: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb6630> <<< 18823 1726855009.52234: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 18823 1726855009.52238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 18823 1726855009.52253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 18823 1726855009.52292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 18823 1726855009.52338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 18823 1726855009.52427: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a758200> <<< 18823 1726855009.52453: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a7587d0> 
import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240aba7530> <<< 18823 1726855009.52458: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb71d0> <<< 18823 1726855009.52475: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb4d10> <<< 18823 1726855009.52508: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb5700> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18823 1726855009.52569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 18823 1726855009.52592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 18823 1726855009.52656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 18823 1726855009.52711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a75b590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a75ae40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' 
import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a75b020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a75a270> <<< 18823 1726855009.52732: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 18823 1726855009.52829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 18823 1726855009.52871: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a75b740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 18823 1726855009.52898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 18823 1726855009.52942: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a7be270> <<< 18823 1726855009.52952: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7bc290> <<< 18823 1726855009.53204: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb4920> import 'ansible.module_utils.facts.timeout' # <<< 18823 1726855009.53211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 18823 1726855009.53214: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 18823 1726855009.53232: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.53274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 18823 1726855009.53510: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 18823 1726855009.53531: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.53578: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 18823 1726855009.53592: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.53667: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.53706: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.53777: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.53818: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 18823 1726855009.53829: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.54294: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.54716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 18823 1726855009.54759: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.54771: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.54868: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 
1726855009.54977: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # <<< 18823 1726855009.55000: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55036: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 18823 1726855009.55199: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 18823 1726855009.55218: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55244: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 18823 1726855009.55247: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55329: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 18823 1726855009.55513: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7bf8c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18823 1726855009.55622: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7beed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 18823 1726855009.55737: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55754: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.lsb' # <<< 18823 1726855009.55831: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.55848: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.56010: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 18823 1726855009.56013: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.56310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.56375: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a7f6450> <<< 18823 1726855009.56720: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7bfb60> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 18823 1726855009.56759: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.56854: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.56961: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.57114: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 18823 1726855009.57118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 18823 1726855009.57170: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 18823 1726855009.57190: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 18823 1726855009.57237: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.57254: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.57299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 18823 1726855009.57333: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.57337: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855009.57405: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a80a060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7e6cc0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 18823 1726855009.57446: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.57593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 18823 1726855009.57597: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.57714: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.57804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 18823 1726855009.57808: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.58043: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.58060: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.58079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 18823 1726855009.58096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 18823 1726855009.58162: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.58509: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 18823 1726855009.58547: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.58666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 18823 1726855009.58821: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.58825: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.59298: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.59801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 18823 1726855009.59918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.60106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 18823 1726855009.60119: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.60216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18823 1726855009.60256: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.60579: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 18823 1726855009.60616: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.60719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 18823 1726855009.60762: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.60860: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61059: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 18823 1726855009.61282: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61311: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61378: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 18823 1726855009.61467: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 18823 1726855009.61473: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61814: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 18823 1726855009.61817: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.61859: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 18823 1726855009.61873: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.62133: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 18823 1726855009.62402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 18823 1726855009.62508: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.62522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 18823 1726855009.62568: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.62604: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 18823 1726855009.62616: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.62723: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 18823 1726855009.62727: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.62768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 18823 1726855009.62863: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.62940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 18823 1726855009.63054: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 18823 1726855009.63157: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.63310: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.63329: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 18823 1726855009.63342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 
'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 18823 1726855009.63403: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.63484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18823 1726855009.63648: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.63836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 18823 1726855009.63916: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.64042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855009.64050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 18823 1726855009.64136: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.64215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 18823 1726855009.64239: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 18823 1726855009.64409: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18823 1726855009.64479: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855009.64781: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 18823 1726855009.64808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18823 1726855009.64812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches 
/usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a5a2a80> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5a32f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5a0950> <<< 18823 1726855009.81774: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py<<< 18823 1726855009.81868: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5e92e0> <<< 18823 1726855009.81874: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 18823 1726855009.81904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 18823 1726855009.81962: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5ea060> <<< 18823 1726855009.82027: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 18823 1726855009.82052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855009.82147: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 18823 1726855009.82151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7fc830> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7fc3b0> <<< 18823 1726855009.82518: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 18823 1726855010.02982: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualiz<<< 18823 1726855010.03023: stdout chunk (state=3): >>>ation_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.58154296875, "5m": 0.40478515625, "15m": 0.20556640625}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "49", 
"epoch": "1726855009", "epoch_int": "1726855009", "date": "2024-09-20", "time": "13:56:49", "iso8601_micro": "2024-09-20T17:56:49.657922Z", "iso8601": "2024-09-20T17:56:49Z", "iso8601_basic": "20240920T135649657922", "iso8601_basic_short": "20240920T135649", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": 
{}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a<<< 18823 1726855010.03032: stdout chunk (state=3): >>>-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 792, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794885632, "block_size": 4096, "block_total": 65519099, "block_available": 63914767, "block_used": 1604332, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855010.03710: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 18823 1726855010.03714: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path<<< 18823 1726855010.03767: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing 
_weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport<<< 18823 1726855010.03775: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath <<< 18823 1726855010.03797: stdout chunk (state=3): >>># cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser <<< 18823 1726855010.03847: stdout chunk (state=3): >>># cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib <<< 18823 1726855010.03852: stdout chunk 
(state=3): >>># cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 18823 1726855010.03855: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile <<< 18823 1726855010.03884: stdout chunk (state=3): >>># cleanup[2] removing threading <<< 18823 1726855010.03891: stdout chunk (state=3): >>># cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing <<< 18823 1726855010.03927: stdout chunk (state=3): >>># destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # 
cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 18823 1726855010.03931: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 18823 1726855010.03961: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters <<< 18823 1726855010.03990: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 18823 1726855010.04011: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils<<< 18823 1726855010.04022: stdout chunk (state=3): >>> # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing <<< 18823 1726855010.04038: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version <<< 18823 1726855010.04078: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd <<< 18823 1726855010.04082: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat 
# cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware <<< 18823 1726855010.04119: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy 
ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 18823 1726855010.04538: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 18823 1726855010.04550: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util<<< 18823 
1726855010.04586: stdout chunk (state=3): >>> # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 18823 1726855010.04631: stdout chunk (state=3): >>># destroy ntpath <<< 18823 1726855010.04671: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 18823 1726855010.04700: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 18823 1726855010.04739: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 18823 1726855010.04749: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 18823 1726855010.04809: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 18823 1726855010.04862: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 18823 1726855010.04913: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 18823 1726855010.04917: stdout chunk (state=3): >>># destroy _ssl <<< 18823 1726855010.04979: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios 
<<< 18823 1726855010.04983: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing <<< 18823 1726855010.05105: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 18823 1726855010.05113: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 18823 1726855010.05174: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy 
re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 18823 1726855010.05179: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 18823 1726855010.05199: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 18823 1726855010.05203: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18823 1726855010.05486: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 18823 1726855010.05522: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18823 1726855010.05641: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna <<< 18823 1726855010.05661: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 18823 1726855010.05716: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 18823 1726855010.05721: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 18823 1726855010.05744: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 18823 1726855010.05786: stdout chunk (state=3): >>># clear sys.audit hooks <<< 18823 1726855010.06219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855010.06245: stderr chunk (state=3): >>><<< 18823 1726855010.06248: stdout chunk (state=3): >>><<< 18823 1726855010.06363: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b7684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b737b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b76aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b53d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b53dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b57be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b57bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5b37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5b3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b593ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5911f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b578fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5d3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5d2390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b592090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b5d0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b608800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b578230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b608cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b608b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b608ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b576d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b609580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b609250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b60a480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b6206b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b621d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b622c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b623290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b622180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b623d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b623440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b60a4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b323bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b34da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b321d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b60abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b37b140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b39b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3fc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3fe9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3fc380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b3c5280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad29370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b39a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240b34fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f240b39a420> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9kw_mjv2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f240ad8eff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad6dee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad6d040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad8cec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240adc2990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc2720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc2030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ad8fc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240adc3710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240adc3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240adc3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac29b50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac2b800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2c200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240b37b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac37d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac36810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac36570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac36ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac2e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac7ba10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac7dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac801a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac83950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac80350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac847d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac849b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac84cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac7c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab10380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac86b10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ac87e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac86720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab15970> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab16780> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab118e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab16930> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab17b60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240ab22330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab1dd90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ac0ab70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240acfe840> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab22030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240ab154f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb6630> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a758200> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a7587d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240aba7530> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb71d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb4d10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb5700> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a75b590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a75ae40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a75b020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a75a270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a75b740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a7be270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7bc290> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240abb4920> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7bf8c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7beed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a7f6450> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7bfb60> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a80a060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7e6cc0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f240a5a2a80> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5a32f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5a0950> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5e92e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a5ea060> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7fc830> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f240a7fc3b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.58154296875, "5m": 0.40478515625, "15m": 0.20556640625}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "49", "epoch": "1726855009", "epoch_int": "1726855009", "date": "2024-09-20", "time": "13:56:49", "iso8601_micro": "2024-09-20T17:56:49.657922Z", "iso8601": "2024-09-20T17:56:49Z", "iso8601_basic": "20240920T135649657922", "iso8601_basic_short": "20240920T135649", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": 
"10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], 
"uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 792, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794885632, "block_size": 4096, "block_total": 65519099, "block_available": 63914767, "block_used": 1604332, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing 
_weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing 
errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # 
cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing 
# cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # 
cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] 
removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # 
cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy 
ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy 
ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy 
_pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap 
# cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # 
destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.45.178 closed. [WARNING]: Module invocation had junk after the JSON data:
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
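The `[WARNING]` above is Ansible's interpreter-discovery notice: the discovered path `/usr/bin/python3.12` was chosen dynamically, and installing another Python later could change what that path resolves to. Pinning `ansible_python_interpreter` in inventory avoids the warning; a minimal sketch (the layout mirrors the YAML inventory loaded earlier, but the host name, address, and interpreter path here are illustrative, not taken from this run's actual `inventory.yml`):

```yaml
# Illustrative inventory snippet -- pin the interpreter explicitly so
# discovery does not depend on which interpreters are installed later.
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.45.178            # example address
      ansible_python_interpreter: /usr/bin/python3.12
```

With the variable set, interpreter discovery is skipped for that host and the warning is not emitted.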
18823 1726855010.07558: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855010.07563: _low_level_execute_command(): starting 18823 1726855010.07566: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855008.4115393-18859-101980585277180/ > /dev/null 2>&1 && sleep 0' 18823 1726855010.08160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855010.08202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.08231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855010.08311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 <<< 18823 1726855010.08333: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855010.08377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.08459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855010.10394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.10398: stdout chunk (state=3): >>><<< 18823 1726855010.10400: stderr chunk (state=3): >>><<< 18823 1726855010.10404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 
1726855010.10406: handler run complete 18823 1726855010.10473: variable 'ansible_facts' from source: unknown 18823 1726855010.10582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.10940: variable 'ansible_facts' from source: unknown 18823 1726855010.11039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.11292: attempt loop complete, returning result 18823 1726855010.11295: _execute() done 18823 1726855010.11297: dumping result to json 18823 1726855010.11300: done dumping result, returning 18823 1726855010.11302: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-00000000007c] 18823 1726855010.11304: sending task result for task 0affcc66-ac2b-d391-077c-00000000007c 18823 1726855010.12336: done sending task result for task 0affcc66-ac2b-d391-077c-00000000007c 18823 1726855010.12347: WORKER PROCESS EXITING ok: [managed_node2] 18823 1726855010.12456: no more pending results, returning what we have 18823 1726855010.12458: results queue empty 18823 1726855010.12459: checking for any_errors_fatal 18823 1726855010.12460: done checking for any_errors_fatal 18823 1726855010.12461: checking for max_fail_percentage 18823 1726855010.12462: done checking for max_fail_percentage 18823 1726855010.12463: checking to see if all hosts have failed and the running result is not ok 18823 1726855010.12463: done checking to see if all hosts have failed 18823 1726855010.12464: getting the remaining hosts for this loop 18823 1726855010.12466: done getting the remaining hosts for this loop 18823 1726855010.12469: getting the next task for host managed_node2 18823 1726855010.12474: done getting next task for host managed_node2 18823 1726855010.12475: ^ task is: TASK: meta (flush_handlers) 18823 1726855010.12477: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855010.12480: getting variables 18823 1726855010.12481: in VariableManager get_vars() 18823 1726855010.12502: Calling all_inventory to load vars for managed_node2 18823 1726855010.12505: Calling groups_inventory to load vars for managed_node2 18823 1726855010.12507: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855010.12516: Calling all_plugins_play to load vars for managed_node2 18823 1726855010.12518: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855010.12520: Calling groups_plugins_play to load vars for managed_node2 18823 1726855010.12712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.12911: done with get_vars() 18823 1726855010.12921: done getting variables 18823 1726855010.12998: in VariableManager get_vars() 18823 1726855010.13007: Calling all_inventory to load vars for managed_node2 18823 1726855010.13009: Calling groups_inventory to load vars for managed_node2 18823 1726855010.13011: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855010.13016: Calling all_plugins_play to load vars for managed_node2 18823 1726855010.13017: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855010.13020: Calling groups_plugins_play to load vars for managed_node2 18823 1726855010.13153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.13337: done with get_vars() 18823 1726855010.13349: done queuing things up, now waiting for results queue to drain 18823 1726855010.13351: results queue empty 18823 1726855010.13351: checking for any_errors_fatal 18823 1726855010.13353: done checking for any_errors_fatal 18823 
1726855010.13354: checking for max_fail_percentage 18823 1726855010.13355: done checking for max_fail_percentage 18823 1726855010.13359: checking to see if all hosts have failed and the running result is not ok 18823 1726855010.13360: done checking to see if all hosts have failed 18823 1726855010.13361: getting the remaining hosts for this loop 18823 1726855010.13361: done getting the remaining hosts for this loop 18823 1726855010.13364: getting the next task for host managed_node2 18823 1726855010.13367: done getting next task for host managed_node2 18823 1726855010.13369: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18823 1726855010.13371: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855010.13372: getting variables 18823 1726855010.13373: in VariableManager get_vars() 18823 1726855010.13380: Calling all_inventory to load vars for managed_node2 18823 1726855010.13382: Calling groups_inventory to load vars for managed_node2 18823 1726855010.13384: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855010.13390: Calling all_plugins_play to load vars for managed_node2 18823 1726855010.13392: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855010.13395: Calling groups_plugins_play to load vars for managed_node2 18823 1726855010.13579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.13782: done with get_vars() 18823 1726855010.13792: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 13:56:50 
-0400 (0:00:01.778) 0:00:01.790 ****** 18823 1726855010.13873: entering _queue_task() for managed_node2/include_tasks 18823 1726855010.13875: Creating lock for include_tasks 18823 1726855010.14310: worker is 1 (out of 1 available) 18823 1726855010.14321: exiting _queue_task() for managed_node2/include_tasks 18823 1726855010.14332: done queuing things up, now waiting for results queue to drain 18823 1726855010.14336: waiting for pending results... 18823 1726855010.14486: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 18823 1726855010.14592: in run() - task 0affcc66-ac2b-d391-077c-000000000006 18823 1726855010.14622: variable 'ansible_search_path' from source: unknown 18823 1726855010.14661: calling self._execute() 18823 1726855010.14750: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855010.14761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.14832: variable 'omit' from source: magic vars 18823 1726855010.14899: _execute() done 18823 1726855010.14909: dumping result to json 18823 1726855010.14917: done dumping result, returning 18823 1726855010.14925: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affcc66-ac2b-d391-077c-000000000006] 18823 1726855010.14944: sending task result for task 0affcc66-ac2b-d391-077c-000000000006 18823 1726855010.15172: done sending task result for task 0affcc66-ac2b-d391-077c-000000000006 18823 1726855010.15176: WORKER PROCESS EXITING 18823 1726855010.15221: no more pending results, returning what we have 18823 1726855010.15227: in VariableManager get_vars() 18823 1726855010.15269: Calling all_inventory to load vars for managed_node2 18823 1726855010.15273: Calling groups_inventory to load vars for managed_node2 18823 1726855010.15277: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855010.15293: Calling all_plugins_play to load vars for managed_node2 18823 
1726855010.15297: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855010.15300: Calling groups_plugins_play to load vars for managed_node2 18823 1726855010.15638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.15862: done with get_vars() 18823 1726855010.15869: variable 'ansible_search_path' from source: unknown 18823 1726855010.15883: we have included files to process 18823 1726855010.15884: generating all_blocks data 18823 1726855010.15885: done generating all_blocks data 18823 1726855010.15886: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18823 1726855010.15890: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18823 1726855010.15893: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18823 1726855010.16597: in VariableManager get_vars() 18823 1726855010.16619: done with get_vars() 18823 1726855010.16631: done processing included file 18823 1726855010.16633: iterating over new_blocks loaded from include file 18823 1726855010.16634: in VariableManager get_vars() 18823 1726855010.16644: done with get_vars() 18823 1726855010.16646: filtering new block on tags 18823 1726855010.16660: done filtering new block on tags 18823 1726855010.16663: in VariableManager get_vars() 18823 1726855010.16672: done with get_vars() 18823 1726855010.16674: filtering new block on tags 18823 1726855010.16697: done filtering new block on tags 18823 1726855010.16700: in VariableManager get_vars() 18823 1726855010.16712: done with get_vars() 18823 1726855010.16713: filtering new block on tags 18823 1726855010.16734: done filtering new block on tags 18823 1726855010.16736: done iterating over new_blocks loaded from include file included: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 18823 1726855010.16742: extending task lists for all hosts with included blocks 18823 1726855010.16797: done extending task lists 18823 1726855010.16799: done processing included files 18823 1726855010.16800: results queue empty 18823 1726855010.16800: checking for any_errors_fatal 18823 1726855010.16801: done checking for any_errors_fatal 18823 1726855010.16802: checking for max_fail_percentage 18823 1726855010.16803: done checking for max_fail_percentage 18823 1726855010.16804: checking to see if all hosts have failed and the running result is not ok 18823 1726855010.16805: done checking to see if all hosts have failed 18823 1726855010.16805: getting the remaining hosts for this loop 18823 1726855010.16806: done getting the remaining hosts for this loop 18823 1726855010.16809: getting the next task for host managed_node2 18823 1726855010.16813: done getting next task for host managed_node2 18823 1726855010.16815: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18823 1726855010.16817: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855010.16819: getting variables 18823 1726855010.16820: in VariableManager get_vars() 18823 1726855010.16829: Calling all_inventory to load vars for managed_node2 18823 1726855010.16839: Calling groups_inventory to load vars for managed_node2 18823 1726855010.16841: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855010.16846: Calling all_plugins_play to load vars for managed_node2 18823 1726855010.16848: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855010.16851: Calling groups_plugins_play to load vars for managed_node2 18823 1726855010.17023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.17228: done with get_vars() 18823 1726855010.17237: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:56:50 -0400 (0:00:00.034) 0:00:01.824 ****** 18823 1726855010.17307: entering _queue_task() for managed_node2/setup 18823 1726855010.17682: worker is 1 (out of 1 available) 18823 1726855010.17695: exiting _queue_task() for managed_node2/setup 18823 1726855010.17772: done queuing things up, now waiting for results queue to drain 18823 1726855010.17773: waiting for pending results... 
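Every verbose line in this stream carries the same prefix: the controller process id, a space, and a high-resolution Unix timestamp ending in a colon (e.g. `18823 1726855010.17307:`). That interpretation of the two fields is an assumption inferred from the values shown, not something the log states; under that assumption, a minimal sketch of a parser for these lines:

```python
import re

# Prefix format observed in the log above: "<pid> <unix-timestamp>: <message>".
# Treating the first field as the controller PID and the second as a
# time.time() float is an assumption based on the values shown.
LINE_RE = re.compile(r"^(?P<pid>\d+)\s+(?P<ts>\d+\.\d+):\s+(?P<msg>.*)$")

def parse_debug_line(line: str):
    """Split one -vvvv log line into (pid, timestamp, message); None if it
    does not match the prefix format (e.g. wrapped stdout/stderr chunks)."""
    m = LINE_RE.match(line)
    if not m:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

print(parse_debug_line(
    "18823 1726855010.17307: entering _queue_task() for managed_node2/setup"
))
```

Subtracting consecutive timestamps recovered this way gives per-step latencies, which is often easier than eyeballing the elapsed-time counter in the `TASK [...]` banner lines.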
18823 1726855010.17942: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 18823 1726855010.18040: in run() - task 0affcc66-ac2b-d391-077c-00000000008d 18823 1726855010.18044: variable 'ansible_search_path' from source: unknown 18823 1726855010.18092: variable 'ansible_search_path' from source: unknown 18823 1726855010.18103: calling self._execute() 18823 1726855010.18167: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855010.18178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.18194: variable 'omit' from source: magic vars 18823 1726855010.18800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855010.20921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855010.20965: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855010.20992: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855010.21030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855010.21052: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855010.21113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855010.21136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855010.21154: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855010.21181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855010.21196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855010.21311: variable 'ansible_facts' from source: unknown 18823 1726855010.21353: variable 'network_test_required_facts' from source: task vars 18823 1726855010.21382: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 18823 1726855010.21385: variable 'omit' from source: magic vars 18823 1726855010.21414: variable 'omit' from source: magic vars 18823 1726855010.21437: variable 'omit' from source: magic vars 18823 1726855010.21456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855010.21480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855010.21502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855010.21512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855010.21521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855010.21542: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855010.21545: variable 'ansible_host' from source: host vars for 
'managed_node2' 18823 1726855010.21548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.21616: Set connection var ansible_timeout to 10 18823 1726855010.21622: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855010.21624: Set connection var ansible_shell_type to sh 18823 1726855010.21629: Set connection var ansible_shell_executable to /bin/sh 18823 1726855010.21634: Set connection var ansible_connection to ssh 18823 1726855010.21639: Set connection var ansible_pipelining to False 18823 1726855010.21659: variable 'ansible_shell_executable' from source: unknown 18823 1726855010.21662: variable 'ansible_connection' from source: unknown 18823 1726855010.21665: variable 'ansible_module_compression' from source: unknown 18823 1726855010.21669: variable 'ansible_shell_type' from source: unknown 18823 1726855010.21672: variable 'ansible_shell_executable' from source: unknown 18823 1726855010.21674: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855010.21676: variable 'ansible_pipelining' from source: unknown 18823 1726855010.21679: variable 'ansible_timeout' from source: unknown 18823 1726855010.21681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.21774: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855010.21781: variable 'omit' from source: magic vars 18823 1726855010.21788: starting attempt loop 18823 1726855010.21791: running the handler 18823 1726855010.21806: _low_level_execute_command(): starting 18823 1726855010.21811: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855010.22293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.22299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.22305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.22307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.22374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.22450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855010.24384: stdout chunk (state=3): >>>/root <<< 18823 1726855010.24431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.24456: stderr chunk (state=3): >>><<< 18823 1726855010.24459: stdout chunk (state=3): >>><<< 18823 1726855010.24482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855010.24495: _low_level_execute_command(): starting 18823 1726855010.24506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995 `" && echo ansible-tmp-1726855010.2448149-18923-162511578347995="` echo /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995 `" ) && sleep 0' 18823 1726855010.24948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.24951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.24954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.24956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.25008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855010.25016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.25099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855010.27506: stdout chunk (state=3): >>>ansible-tmp-1726855010.2448149-18923-162511578347995=/root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995 <<< 18823 1726855010.27667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.27692: stderr chunk (state=3): >>><<< 18823 1726855010.27696: stdout chunk (state=3): >>><<< 18823 1726855010.27717: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855010.2448149-18923-162511578347995=/root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855010.27761: variable 'ansible_module_compression' from source: unknown 18823 1726855010.27803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855010.27852: variable 'ansible_facts' from source: unknown 18823 1726855010.27984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py 18823 1726855010.28092: Sending initial data 18823 1726855010.28098: Sent initial data (154 bytes) 18823 1726855010.28567: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855010.28570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.28573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855010.28577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855010.28579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.28626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855010.28630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.28716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855010.30985: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18823 1726855010.30993: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855010.31061: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855010.31142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp38sap9ds /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py <<< 18823 1726855010.31145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py" <<< 18823 1726855010.31271: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp38sap9ds" to remote "/root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py" <<< 18823 1726855010.31274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py" <<< 18823 1726855010.32532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.32545: stderr chunk (state=3): >>><<< 18823 1726855010.32548: stdout chunk (state=3): >>><<< 18823 1726855010.32566: done transferring module to remote 18823 1726855010.32579: _low_level_execute_command(): starting 18823 1726855010.32583: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/ /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py && sleep 0' 18823 1726855010.33027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855010.33031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 
1726855010.33041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.33092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855010.33098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855010.33118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.33182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855010.35737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.35761: stderr chunk (state=3): >>><<< 18823 1726855010.35764: stdout chunk (state=3): >>><<< 18823 1726855010.35779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855010.35782: _low_level_execute_command(): starting 18823 1726855010.35790: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/AnsiballZ_setup.py && sleep 0' 18823 1726855010.36239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855010.36243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.36245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.36247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.36301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855010.36305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.36394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855010.39445: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18823 1726855010.43146: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82918bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: 
'/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/pyt<<< 18823 1726855010.43210: stdout chunk (state=3): >>>hon3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fabe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from 
'/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fabf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fe3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fe3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fc3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fc1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fa9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829003800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829002420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fc2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829000b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829038860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fa82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_s<<< 18823 1726855010.43214: stdout chunk (state=3): >>>truct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829038d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829038bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829038f80> import 'base64' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff828fa6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829039610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8290392e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82903a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829050710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829051df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829052c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8290532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8290521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829053d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8290534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82903a540> # /usr/lib64/python3.12/__pycac<<< 18823 1726855010.43220: stdout chunk (state=3): >>>he__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 
'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d53c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7c590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7c8c0> import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff828d51df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 18823 1726855010.43246: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 18823 1726855010.43283: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18823 1726855010.43310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 18823 1726855010.43326: stdout chunk (state=3): >>> import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7ede0> <<< 18823 1726855010.43394: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7db20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82903ac30><<< 18823 1726855010.43400: stdout chunk (state=3): >>> <<< 18823 1726855010.43432: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 18823 1726855010.43438: stdout chunk (state=3): >>> <<< 18823 1726855010.43548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 18823 1726855010.43557: stdout chunk (state=3): >>> <<< 18823 1726855010.43602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 18823 1726855010.43643: stdout chunk (state=3): >>> import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828dab140> <<< 18823 1726855010.43726: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18823 1726855010.43752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 18823 1726855010.43791: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 18823 1726855010.43798: stdout chunk (state=3): >>> <<< 18823 1726855010.43825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 18823 1726855010.43830: stdout chunk (state=3): >>> <<< 18823 1726855010.43899: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828dcb4d0><<< 18823 1726855010.43928: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18823 1726855010.44002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 18823 1726855010.44051: stdout chunk (state=3): >>> <<< 18823 1726855010.44100: stdout chunk (state=3): >>>import 'ntpath' # <<< 18823 1726855010.44139: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 18823 1726855010.44154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 18823 1726855010.44157: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828e2c200><<< 18823 1726855010.44161: stdout chunk (state=3): >>> <<< 18823 1726855010.44202: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 18823 1726855010.44246: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 18823 1726855010.44254: stdout chunk (state=3): >>> <<< 18823 1726855010.44289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 18823 1726855010.44295: stdout chunk (state=3): >>> <<< 18823 1726855010.44354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 18823 1726855010.44490: stdout chunk (state=3): >>> import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828e2e960> <<< 18823 1726855010.44610: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828e2c320> <<< 18823 1726855010.44671: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828df1250> <<< 18823 1726855010.44711: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 18823 1726855010.44734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 18823 1726855010.44739: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287252e0> <<< 18823 1726855010.44776: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828dca2d0> <<< 18823 1726855010.44799: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7fd40> <<< 18823 1726855010.45100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 18823 1726855010.45132: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff828dca630> <<< 18823 1726855010.45491: stdout 
chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_m8zcw_jc/ansible_setup_payload.zip' <<< 18823 1726855010.45556: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.45726: stdout chunk (state=3): >>># zipimport: zlib available<<< 18823 1726855010.45730: stdout chunk (state=3): >>> <<< 18823 1726855010.45763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 18823 1726855010.45768: stdout chunk (state=3): >>> <<< 18823 1726855010.45791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 18823 1726855010.45797: stdout chunk (state=3): >>> <<< 18823 1726855010.45858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 18823 1726855010.45863: stdout chunk (state=3): >>> <<< 18823 1726855010.46002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 18823 1726855010.46015: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 18823 1726855010.46030: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82878ef90><<< 18823 1726855010.46044: stdout chunk (state=3): >>> import '_typing' # <<< 18823 1726855010.46334: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82876de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82876cfe0><<< 18823 1726855010.46357: stdout chunk (state=3): >>> # zipimport: zlib available<<< 18823 1726855010.46393: stdout chunk (state=3): >>> import 'ansible' # <<< 18823 1726855010.46416: stdout chunk 
(state=3): >>># zipimport: zlib available<<< 18823 1726855010.46421: stdout chunk (state=3): >>> <<< 18823 1726855010.46440: stdout chunk (state=3): >>># zipimport: zlib available<<< 18823 1726855010.46473: stdout chunk (state=3): >>> # zipimport: zlib available<<< 18823 1726855010.46499: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 18823 1726855010.46523: stdout chunk (state=3): >>># zipimport: zlib available<<< 18823 1726855010.46657: stdout chunk (state=3): >>> <<< 18823 1726855010.48748: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.50638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 18823 1726855010.50654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 18823 1726855010.50662: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82878ce30> <<< 18823 1726855010.50710: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py<<< 18823 1726855010.50714: stdout chunk (state=3): >>> <<< 18823 1726855010.50727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855010.50762: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 18823 1726855010.50772: stdout chunk (state=3): >>> <<< 18823 1726855010.50798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18823 1726855010.50822: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py<<< 18823 1726855010.50829: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 18823 1726855010.50922: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.50938: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8287be960><<< 18823 1726855010.50990: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287be6f0> <<< 18823 1726855010.51031: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287be000><<< 18823 1726855010.51069: stdout chunk (state=3): >>> <<< 18823 1726855010.51083: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 18823 1726855010.51168: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287be4e0><<< 18823 1726855010.51193: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82878f9b0> import 'atexit' # <<< 18823 1726855010.51221: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855010.51311: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8287bf6e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.51333: stdout chunk (state=3): >>># 
extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8287bf920><<< 18823 1726855010.51344: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18823 1726855010.51417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 18823 1726855010.51439: stdout chunk (state=3): >>> import '_locale' # <<< 18823 1726855010.51500: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287bfe30><<< 18823 1726855010.51536: stdout chunk (state=3): >>> import 'pwd' # <<< 18823 1726855010.51552: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18823 1726855010.51579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 18823 1726855010.51643: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828629ca0> <<< 18823 1726855010.51713: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.51717: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.51738: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82862b8c0> <<< 18823 1726855010.51782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18823 1726855010.51831: stdout chunk 
(state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862c2c0><<< 18823 1726855010.51866: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18823 1726855010.51909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18823 1726855010.51959: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18823 1726855010.52031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 18823 1726855010.52045: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 18823 1726855010.52145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862fef0><<< 18823 1726855010.52156: stdout chunk (state=3): >>> <<< 18823 1726855010.52205: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855010.52259: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828fa6ed0> <<< 18823 1726855010.52263: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862e1b0> <<< 18823 1726855010.52294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py 
<<< 18823 1726855010.52350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 18823 1726855010.52374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 18823 1726855010.52412: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18823 1726855010.52585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 18823 1726855010.52603: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 18823 1726855010.52645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 18823 1726855010.52650: stdout chunk (state=3): >>> <<< 18823 1726855010.52895: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828637f20> import '_tokenize' # <<< 18823 1726855010.52937: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8286369f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828636750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828636cc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82867c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867c350> <<< 18823 1726855010.52976: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18823 1726855010.52989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18823 1726855010.53030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18823 1726855010.53053: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82867ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867db80> <<< 18823 1726855010.53088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18823 1726855010.53108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18823 1726855010.53163: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8286802c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867e450> <<< 18823 1726855010.53166: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18823 1726855010.53254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855010.53291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828683aa0> <<< 18823 1726855010.53471: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828680470> <<< 18823 1726855010.53494: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828684830> <<< 18823 1726855010.53511: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828684a70> <<< 18823 1726855010.53693: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8286843e0> <<< 18823 1726855010.53697: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867c500> <<< 18823 1726855010.53726: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8285104d0> <<< 18823 1726855010.53829: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.53862: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828511a90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828686c30> # extension module 
'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.53960: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828687fb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828686840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18823 1726855010.54010: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.54159: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.54185: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18823 1726855010.54286: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.54413: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.55120: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.56030: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18823 1726855010.56070: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18823 1726855010.56112: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 18823 1726855010.56135: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 18823 1726855010.56232: stdout chunk (state=3): >>> # extension module '_ctypes' loaded 
from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 18823 1726855010.56245: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828515b80> <<< 18823 1726855010.56403: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 18823 1726855010.56413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 18823 1726855010.56441: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285169f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828511a30> <<< 18823 1726855010.56522: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 18823 1726855010.56566: stdout chunk (state=3): >>> # zipimport: zlib available<<< 18823 1726855010.56582: stdout chunk (state=3): >>> # zipimport: zlib available <<< 18823 1726855010.56611: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 18823 1726855010.56635: stdout chunk (state=3): >>> # zipimport: zlib available<<< 18823 1726855010.56762: stdout chunk (state=3): >>> <<< 18823 1726855010.57154: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 18823 1726855010.57164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18823 1726855010.57167: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828516570> <<< 18823 1726855010.57185: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.57702: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18823 1726855010.58141: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58216: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58288: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18823 1726855010.58307: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58337: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58370: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18823 1726855010.58390: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58449: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58537: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18823 1726855010.58615: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 18823 1726855010.58655: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.58750: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 18823 1726855010.58890: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.59117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18823 1726855010.59185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18823 1726855010.59207: stdout chunk (state=3): >>>import '_ast' # <<< 18823 1726855010.59259: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828517a10> <<< 18823 1726855010.59270: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.59336: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.59426: stdout 
chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 18823 1726855010.59448: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.59542: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18823 1726855010.59592: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.59632: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.59756: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18823 1726855010.59789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855010.59870: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8285222d0> <<< 18823 1726855010.59905: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82851d1c0> <<< 18823 1726855010.59933: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 18823 1726855010.59966: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60014: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60132: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available <<< 18823 1726855010.60253: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18823 1726855010.60258: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18823 1726855010.60264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18823 1726855010.60356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18823 1726855010.60359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18823 1726855010.60362: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82860aa80> <<< 18823 1726855010.60404: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287ea750> <<< 18823 1726855010.60490: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8286854f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8286366c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 18823 1726855010.60640: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60644: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60657: stdout chunk (state=3): >>>import 
'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18823 1726855010.60710: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60762: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60785: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60818: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.60848: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61169: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.61208: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61232: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61276: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 18823 1726855010.61285: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61555: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61813: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61868: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.61937: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855010.61994: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 18823 1726855010.62032: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 18823 1726855010.62094: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b2840> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 18823 1726855010.62182: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 18823 1726855010.62220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 18823 1726855010.62237: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828180170> <<< 18823 1726855010.62265: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.62316: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8281804d0> <<< 18823 1726855010.62424: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285a7410> import 
'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b3380> <<< 18823 1726855010.62481: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b0f20> <<< 18823 1726855010.62511: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b0a40> <<< 18823 1726855010.62549: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18823 1726855010.62640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc'<<< 18823 1726855010.62661: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 18823 1726855010.62705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 18823 1726855010.62718: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 18823 1726855010.62793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.62806: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828183440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828182cf0><<< 18823 1726855010.62854: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.62874: stdout chunk 
(state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.62915: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828182ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828182120> <<< 18823 1726855010.62939: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 18823 1726855010.63116: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 18823 1726855010.63157: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828183620> <<< 18823 1726855010.63174: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 18823 1726855010.63224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 18823 1726855010.63286: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.63306: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8281e2150> <<< 18823 1726855010.63347: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8281e0170> <<< 18823 1726855010.63410: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff8285b0c20> <<< 18823 1726855010.63439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 18823 1726855010.63482: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.63504: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.63543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available<<< 18823 1726855010.63635: stdout chunk (state=3): >>> # zipimport: zlib available <<< 18823 1726855010.63896: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 18823 1726855010.63915: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.63949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 18823 1726855010.63964: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64092: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.64156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 18823 1726855010.64159: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64213: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64272: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64330: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 
18823 1726855010.64404: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.64968: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.65602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 18823 1726855010.65650: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.65689: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.65765: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.65864: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 18823 1726855010.65912: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.65965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 18823 1726855010.65982: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.66030: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.66121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 18823 1726855010.66182: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.66214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 18823 1726855010.66398: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.66426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.66543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 18823 1726855010.66598: stdout chunk (state=3): >>>import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff8281e2300> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 18823 1726855010.66637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18823 1726855010.66808: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8281e2ed0> <<< 18823 1726855010.66833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 18823 1726855010.66972: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.67085: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 18823 1726855010.67317: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 18823 1726855010.67425: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.67617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 18823 1726855010.67620: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.67638: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.67705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18823 1726855010.67796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18823 1726855010.67892: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855010.68086: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82821e4b0> <<< 18823 1726855010.68692: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82820e2d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.68743: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 18823 1726855010.68786: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.68829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 18823 1726855010.68849: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.68880: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.68922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 18823 1726855010.69064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828232150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82820f4d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 18823 1726855010.69183: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib 
available <<< 18823 1726855010.69276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.69511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 18823 1726855010.69538: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.69775: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.69986: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.70046: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 18823 1726855010.70290: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # <<< 18823 1726855010.70318: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.70414: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.70505: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.70919: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.71403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 18823 1726855010.71421: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.71511: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.71615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 18823 1726855010.71634: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.71722: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 18823 1726855010.71819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18823 1726855010.71906: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.71979: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 18823 1726855010.72167: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 18823 1726855010.72223: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72325: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 18823 1726855010.72338: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72406: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72456: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72665: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 18823 1726855010.72893: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72912: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.72944: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 18823 1726855010.72994: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73013: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 18823 1726855010.73098: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # 
<<< 18823 1726855010.73220: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.73224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 18823 1726855010.73336: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 18823 1726855010.73356: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73404: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 18823 1726855010.73522: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73739: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.73983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 18823 1726855010.74042: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 18823 1726855010.74143: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 18823 1726855010.74353: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 18823 1726855010.74417: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 
18823 1726855010.74527: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 18823 1726855010.74581: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 18823 1726855010.74644: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74685: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74799: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74812: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18823 1726855010.74833: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74921: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 18823 1726855010.74941: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.74972: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 18823 1726855010.75136: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75225: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75486: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 18823 1726855010.75495: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75519: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 18823 1726855010.75562: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75621: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: 
zlib available <<< 18823 1726855010.75704: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 18823 1726855010.75879: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.75968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 18823 1726855010.76044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18823 1726855010.76059: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855010.76806: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18823 1726855010.76809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828032b40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8280306e0> <<< 18823 1726855010.76847: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82821dcd0> <<< 18823 1726855010.77486: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": 
"20", "hour": "13", "minute": "56", "second": "50", "epoch": "1726855010", "epoch_int": "1726855010", "date": "2024-09-20", "time": "13:56:50", "iso8601_micro": "2024-09-20T17:56:50.760216Z", "iso8601": "2024-09-20T17:56:50Z", "iso8601_basic": "20240920T135650760216", "iso8601_basic_short": "20240920T135650", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2<<< 18823 1726855010.77696: stdout chunk (state=3): 
>>>f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855010.78137: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing 
encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy <<< 18823 1726855010.78422: stdout chunk (state=3): >>># destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing 
math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime 
# cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # 
cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils <<< 18823 1726855010.78433: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time <<< 18823 1726855010.78436: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual <<< 18823 1726855010.78439: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 18823 1726855010.78942: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 18823 1726855010.79009: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 18823 1726855010.79212: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy 
unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 18823 1726855010.79216: stdout chunk (state=3): >>># destroy errno # destroy json <<< 18823 1726855010.79269: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 18823 1726855010.79331: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 18823 1726855010.79338: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 18823 1726855010.79341: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 18823 1726855010.79344: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 18823 1726855010.79427: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping 
_bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 18823 1726855010.79509: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18823 1726855010.79703: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18823 1726855010.79746: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 18823 1726855010.79775: 
stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 18823 1726855010.79837: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18823 1726855010.79840: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18823 1726855010.80041: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 18823 1726855010.80045: stdout chunk (state=3): >>># destroy _hashlib <<< 18823 1726855010.80058: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18823 1726855010.80507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855010.80544: stderr chunk (state=3): >>><<< 18823 1726855010.80552: stdout chunk (state=3): >>><<< 18823 1726855010.81022: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82918bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8291cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fabe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fabf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fe3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fe3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fc3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fc1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fa9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829003800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829002420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fc2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829000b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829038860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fa82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829038d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829038bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829038f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828fa6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829039610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8290392e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82903a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829050710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829051df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff829052c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8290532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8290521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff829053d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8290534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82903a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d53c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7c590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828d7d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d51df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7ede0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7db20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82903ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828dab140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828dcb4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828e2c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828e2e960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828e2c320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828df1250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828dca2d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828d7fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff828dca630> # zipimport: found 103 names in '/tmp/ansible_setup_payload_m8zcw_jc/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff82878ef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82876de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82876cfe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82878ce30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8287be960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287be6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287be000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287be4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82878f9b0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8287bf6e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8287bf920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287bfe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828629ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82862b8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff82862c2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862fef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828fa6ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7ff828637f20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8286369f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828636750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828636cc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82862e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82867c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867c350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82867ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867db80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8286802c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867e450> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828683aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828680470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828684830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828684a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8286843e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82867c500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8285104d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828511a90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828686c30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828687fb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828686840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828515b80> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285169f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828511a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828516570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828517a10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8285222d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82851d1c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82860aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8287ea750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8286854f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8286366c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b2840> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828180170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8281804d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285a7410> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b3380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b0f20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b0a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828183440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828182cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828182ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828182120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff828183620> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8281e2150> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8281e0170> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8285b0c20> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8281e2300> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8281e2ed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff82821e4b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82820e2d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828232150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82820f4d0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff828032b40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8280306e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff82821dcd0> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "50", "epoch": "1726855010", "epoch_int": "1726855010", "date": "2024-09-20", "time": "13:56:50", "iso8601_micro": "2024-09-20T17:56:50.760216Z", "iso8601": "2024-09-20T17:56:50Z", "iso8601_basic": "20240920T135650760216", "iso8601_basic_short": "20240920T135650", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], 
"nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # 
destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] 
removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy 
swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
Shared connection to 10.31.45.178 closed.
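The run above interleaves two kinds of output: raw OpenSSH debug text quoted from the connection (the `debug1:`/`debug2:` lines), and ansible-playbook's own trace records of the form `<worker pid> <fractional epoch timestamp>: <message>` (e.g. `18823 1726855010.83004: ...`). A minimal sketch of a parser for that record shape, useful when grepping a run like this one (a hypothetical helper, not something shipped with Ansible):

```python
import re

# One ansible-playbook -vvv trace record: "<pid> <epoch.frac>: <message>".
# Hypothetical helper for reading logs of this shape, not part of Ansible itself.
RECORD = re.compile(r"(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)")

def parse_record(line: str):
    """Return (pid, timestamp, message) for a trace record, or None for
    continuation output such as SSH debug lines, module stdout, or banners."""
    m = RECORD.match(line)
    if m is None:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

record = parse_record("18823 1726855010.85510: handler run complete")
ssh_noise = parse_record("debug2: Received exit status from master 0")
```

Because the timestamps are fractional epoch seconds, subtracting the `ts` fields of two parsed records gives the elapsed time between them, which is how the per-task durations shown in the `TASK [...]` banners can be cross-checked.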
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache ... # clear sys.audit hooks
18823 1726855010.83004: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
18823 1726855010.83008: _low_level_execute_command(): starting
18823 1726855010.83010: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855010.2448149-18923-162511578347995/ > /dev/null 2>&1 && sleep 0' 18823 1726855010.83129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855010.83132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855010.83147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.83150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.83241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855010.83313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855010.83354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.83452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855010.85499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.85503: stdout chunk (state=3): >>><<< 18823 1726855010.85505: stderr chunk (state=3): >>><<< 18823 1726855010.85508: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855010.85510: handler run complete 18823 1726855010.85512: variable 'ansible_facts' from source: unknown 18823 1726855010.85655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.85963: variable 'ansible_facts' from source: unknown 18823 1726855010.86072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.86181: attempt loop complete, returning result 18823 1726855010.86184: _execute() done 18823 1726855010.86258: dumping result to json 18823 1726855010.86275: done dumping result, returning 18823 1726855010.86298: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 
[0affcc66-ac2b-d391-077c-00000000008d] 18823 1726855010.86307: sending task result for task 0affcc66-ac2b-d391-077c-00000000008d ok: [managed_node2] 18823 1726855010.86664: no more pending results, returning what we have 18823 1726855010.86668: results queue empty 18823 1726855010.86669: checking for any_errors_fatal 18823 1726855010.86670: done checking for any_errors_fatal 18823 1726855010.86671: checking for max_fail_percentage 18823 1726855010.86673: done checking for max_fail_percentage 18823 1726855010.86673: checking to see if all hosts have failed and the running result is not ok 18823 1726855010.86674: done checking to see if all hosts have failed 18823 1726855010.86675: getting the remaining hosts for this loop 18823 1726855010.86676: done getting the remaining hosts for this loop 18823 1726855010.86680: getting the next task for host managed_node2 18823 1726855010.86693: done getting next task for host managed_node2 18823 1726855010.86699: ^ task is: TASK: Check if system is ostree 18823 1726855010.86702: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855010.86706: getting variables 18823 1726855010.86708: in VariableManager get_vars() 18823 1726855010.86739: Calling all_inventory to load vars for managed_node2 18823 1726855010.86742: Calling groups_inventory to load vars for managed_node2 18823 1726855010.86746: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855010.86759: Calling all_plugins_play to load vars for managed_node2 18823 1726855010.86763: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855010.86766: Calling groups_plugins_play to load vars for managed_node2 18823 1726855010.87376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855010.87734: done with get_vars() 18823 1726855010.87746: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:56:50 -0400 (0:00:00.706) 0:00:02.531 ****** 18823 1726855010.87969: entering _queue_task() for managed_node2/stat 18823 1726855010.88676: done sending task result for task 0affcc66-ac2b-d391-077c-00000000008d 18823 1726855010.88710: WORKER PROCESS EXITING 18823 1726855010.88700: worker is 1 (out of 1 available) 18823 1726855010.88718: exiting _queue_task() for managed_node2/stat 18823 1726855010.88727: done queuing things up, now waiting for results queue to drain 18823 1726855010.88729: waiting for pending results... 
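The `entering _queue_task()` / "waiting for pending results..." exchange in the record above is the strategy plugin's producer/consumer loop: a task is handed off to a free worker, and the main loop blocks until the results queue drains. A toy sketch of that flow follows; it uses threads and made-up task strings purely for illustration, whereas real Ansible forks a `WorkerProcess` per task:

```python
# Toy sketch of the queue-task / drain-results pattern seen in the log:
# the strategy puts a task on a queue, a worker runs it and posts the
# outcome, and the caller then drains the results queue.
# (Illustrative only: Ansible's actual workers are forked processes.)
import queue
import threading

task_q: "queue.Queue" = queue.Queue()
result_q: "queue.Queue" = queue.Queue()

def worker() -> None:
    while True:
        task = task_q.get()
        if task is None:        # sentinel: no more work queued
            break
        result_q.put((task, "ok"))

t = threading.Thread(target=worker)
t.start()
task_q.put("TASK: Check if system is ostree")   # _queue_task()
task_q.put(None)
t.join()                                        # WORKER PROCESS EXITING
task, status = result_q.get()                   # drain pending results
print(f"{task} -> {status}")
```

The sentinel `None` plays the role of "done queuing things up"; once it is consumed the worker exits, mirroring the `WORKER PROCESS EXITING` message logged above.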
18823 1726855010.88933: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 18823 1726855010.89033: in run() - task 0affcc66-ac2b-d391-077c-00000000008f 18823 1726855010.89044: variable 'ansible_search_path' from source: unknown 18823 1726855010.89047: variable 'ansible_search_path' from source: unknown 18823 1726855010.89080: calling self._execute() 18823 1726855010.89361: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855010.89365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.89378: variable 'omit' from source: magic vars 18823 1726855010.90415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855010.90869: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855010.90949: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855010.91092: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855010.91157: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855010.91242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855010.91390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855010.91424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855010.91500: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855010.91902: Evaluated conditional (not __network_is_ostree is defined): True 18823 1726855010.91906: variable 'omit' from source: magic vars 18823 1726855010.91909: variable 'omit' from source: magic vars 18823 1726855010.91911: variable 'omit' from source: magic vars 18823 1726855010.92026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855010.92058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855010.92138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855010.92161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855010.92178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855010.92259: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855010.92269: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855010.92277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.92554: Set connection var ansible_timeout to 10 18823 1726855010.92662: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855010.92665: Set connection var ansible_shell_type to sh 18823 1726855010.92667: Set connection var ansible_shell_executable to /bin/sh 18823 1726855010.92669: Set connection var ansible_connection to ssh 18823 1726855010.92671: Set connection var ansible_pipelining to False 18823 1726855010.92674: variable 'ansible_shell_executable' from source: unknown 18823 1726855010.92676: variable 'ansible_connection' from 
source: unknown 18823 1726855010.92678: variable 'ansible_module_compression' from source: unknown 18823 1726855010.92680: variable 'ansible_shell_type' from source: unknown 18823 1726855010.92681: variable 'ansible_shell_executable' from source: unknown 18823 1726855010.92684: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855010.92686: variable 'ansible_pipelining' from source: unknown 18823 1726855010.92690: variable 'ansible_timeout' from source: unknown 18823 1726855010.92692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855010.92997: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855010.93014: variable 'omit' from source: magic vars 18823 1726855010.93028: starting attempt loop 18823 1726855010.93034: running the handler 18823 1726855010.93052: _low_level_execute_command(): starting 18823 1726855010.93106: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855010.93870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855010.93940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855010.93968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855010.94079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855010.96197: stdout chunk (state=3): >>>/root <<< 18823 1726855010.96275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855010.96290: stdout chunk (state=3): >>><<< 18823 1726855010.96309: stderr chunk (state=3): >>><<< 18823 1726855010.96419: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855010.96435: _low_level_execute_command(): starting 18823 1726855010.96438: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330 `" && echo ansible-tmp-1726855010.9632843-18949-148519160361330="` echo /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330 `" ) && sleep 0' 18823 1726855010.97057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855010.97060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855010.97062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855010.97080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855010.97100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855010.97114: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855010.97204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855010.97305: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18823 1726855010.97505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855011.00026: stdout chunk (state=3): >>>ansible-tmp-1726855010.9632843-18949-148519160361330=/root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330 <<< 18823 1726855011.00383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.00388: stderr chunk (state=3): >>><<< 18823 1726855011.00391: stdout chunk (state=3): >>><<< 18823 1726855011.00413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855010.9632843-18949-148519160361330=/root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855011.00631: variable 'ansible_module_compression' from source: 
unknown 18823 1726855011.00647: ANSIBALLZ: Using lock for stat 18823 1726855011.00654: ANSIBALLZ: Acquiring lock 18823 1726855011.00661: ANSIBALLZ: Lock acquired: 140142268013392 18823 1726855011.00668: ANSIBALLZ: Creating module 18823 1726855011.20757: ANSIBALLZ: Writing module into payload 18823 1726855011.20868: ANSIBALLZ: Writing module 18823 1726855011.20898: ANSIBALLZ: Renaming module 18823 1726855011.20911: ANSIBALLZ: Done creating module 18823 1726855011.20933: variable 'ansible_facts' from source: unknown 18823 1726855011.21020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py 18823 1726855011.21208: Sending initial data 18823 1726855011.21211: Sent initial data (153 bytes) 18823 1726855011.21802: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855011.21848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.21863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855011.21956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855011.21969: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855011.21984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.22106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.24407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855011.24483: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855011.24577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp9c8a4hsx /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py <<< 18823 1726855011.24581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py" <<< 18823 1726855011.24690: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp9c8a4hsx" to remote "/root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py" <<< 18823 1726855011.25532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.25696: stderr chunk (state=3): >>><<< 18823 1726855011.25699: stdout chunk (state=3): >>><<< 18823 1726855011.25706: done transferring module to remote 18823 1726855011.25708: _low_level_execute_command(): starting 18823 1726855011.25710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/ /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py && sleep 0' 18823 1726855011.26337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855011.26341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855011.26344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.26406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.26465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855011.26477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.26564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.29068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.29132: stderr chunk (state=3): >>><<< 18823 1726855011.29135: stdout chunk (state=3): >>><<< 18823 1726855011.29138: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855011.29140: _low_level_execute_command(): starting 18823 1726855011.29142: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/AnsiballZ_stat.py && sleep 0' 18823 1726855011.29546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.29550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.29552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.29554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.29594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855011.29605: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.29684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.32880: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 18823 1726855011.32890: stdout chunk (state=3): >>>import '_io' # <<< 18823 1726855011.32896: stdout chunk (state=3): >>>import 'marshal' # <<< 18823 1726855011.32942: stdout chunk (state=3): >>>import 'posix' # <<< 18823 1726855011.32990: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 18823 1726855011.32996: stdout chunk (state=3): >>># installing zipimport hook <<< 18823 1726855011.33055: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 18823 1726855011.33091: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.33124: stdout chunk (state=3): >>>import '_codecs' # <<< 18823 1726855011.33154: stdout chunk (state=3): >>>import 'codecs' # <<< 18823 1726855011.33266: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828adfb30> <<< 18823 1726855011.33269: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 18823 1726855011.33291: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828b12a50> <<< 18823 1726855011.33317: stdout chunk (state=3): >>>import '_signal' # <<< 18823 1726855011.33346: stdout chunk (state=3): >>>import '_abc' # <<< 18823 1726855011.33369: stdout chunk (state=3): >>>import 'abc' # <<< 18823 1726855011.33378: stdout chunk (state=3): >>>import 'io' # <<< 18823 1726855011.33422: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 18823 1726855011.33554: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18823 1726855011.33585: stdout chunk (state=3): >>>import 'genericpath' # <<< 18823 1726855011.33591: stdout chunk (state=3): >>>import 'posixpath' # <<< 18823 1726855011.33659: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 18823 1726855011.33704: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 18823 1726855011.33708: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 18823 1726855011.33890: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288c1fa0> <<< 18823 1726855011.33906: 
stdout chunk (state=3): >>>import 'site' # <<< 18823 1726855011.33929: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 18823 1726855011.34319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18823 1726855011.34327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18823 1726855011.34364: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 18823 1726855011.34372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.34381: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18823 1726855011.34561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 18823 1726855011.34564: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288ffe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fff50><<< 18823 1726855011.34583: stdout chunk (state=3): >>> <<< 18823 1726855011.34609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18823 1726855011.34635: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18823 1726855011.34660: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18823 1726855011.34728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.34749: stdout chunk (state=3): >>>import 'itertools' # <<< 18823 1726855011.34961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828937890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828937f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828917b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828915280> <<< 18823 1726855011.35045: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fd040> <<< 18823 1726855011.35079: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 18823 1726855011.35100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18823 1726855011.35118: stdout chunk (state=3): >>>import '_sre' # <<< 18823 1726855011.35144: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 18823 1726855011.35173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 18823 1726855011.35201: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 18823 1726855011.35204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18823 1726855011.35248: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828957800> <<< 18823 1726855011.35265: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828956420> <<< 18823 1726855011.35297: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828916150> <<< 18823 1726855011.35314: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828954c80> <<< 18823 1726855011.35377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18823 1726855011.35392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898c890> <<< 18823 1726855011.35400: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fc2c0> <<< 18823 1726855011.35427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 18823 1726855011.35432: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18823 1726855011.35473: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.35480: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.35483: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382898cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898cbf0> <<< 18823 1726855011.35518: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.35522: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.35656: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382898cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898d6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898d3a0> import 'importlib.machinery' # <<< 18823 1726855011.35691: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 18823 1726855011.35717: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898e5d0> <<< 18823 1726855011.35735: stdout chunk (state=3): >>>import 'importlib.util' # <<< 18823 1726855011.35743: stdout chunk (state=3): >>>import 'runpy' # <<< 18823 1726855011.35773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 18823 1726855011.35816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18823 1726855011.35842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 18823 1726855011.35852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 18823 1726855011.35857: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a47a0> <<< 18823 1726855011.35877: stdout chunk (state=3): >>>import 'errno' # <<< 18823 1726855011.35915: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.35919: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38289a5eb0> <<< 18823 1726855011.35951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 18823 1726855011.36056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38289a7380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a62a0> <<< 18823 1726855011.36083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 18823 1726855011.36089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 18823 1726855011.36147: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.36152: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.36159: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38289a7e00> <<< 18823 1726855011.36165: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a7530> <<< 18823 1726855011.36229: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898e570> <<< 18823 1726855011.36245: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 18823 1726855011.36283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18823 1726855011.36304: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18823 1726855011.36365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18823 1726855011.36370: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382872fce0> <<< 18823 1726855011.36400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 18823 1726855011.36406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18823 1726855011.36558: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828758740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287584a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828758770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18823 1726855011.36588: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.36774: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38287590a0> <<< 18823 1726855011.36933: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.36940: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828759a60> <<< 18823 1726855011.36952: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828758950> <<< 18823 1726855011.36982: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382872de80> <<< 18823 1726855011.37008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18823 1726855011.37039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18823 1726855011.37067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18823 1726855011.37086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 18823 1726855011.37105: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382875ae10> <<< 18823 1726855011.37124: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287598e0> <<< 18823 1726855011.37156: stdout 
chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898ecc0> <<< 18823 1726855011.37182: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18823 1726855011.37262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.37359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18823 1726855011.37362: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828783170> <<< 18823 1726855011.37432: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18823 1726855011.37449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.37468: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18823 1726855011.37502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18823 1726855011.37553: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287a74d0> <<< 18823 1726855011.37579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18823 1726855011.37636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18823 1726855011.37711: stdout chunk (state=3): >>>import 'ntpath' # <<< 18823 1726855011.37757: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288082f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18823 1726855011.37792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18823 1726855011.37957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18823 1726855011.37995: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382880aa20> <<< 18823 1726855011.38098: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288083e0> <<< 18823 1726855011.38144: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287cd2e0> <<< 18823 1726855011.38180: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 18823 1726855011.38183: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281253d0> <<< 18823 1726855011.38216: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287a6300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382875bd40> <<< 18823 1726855011.38391: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc'<<< 18823 1726855011.38399: stdout chunk (state=3): >>> <<< 18823 1726855011.38418: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f38287a6660> <<< 18823 1726855011.38820: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_v8yu0r_w/ansible_stat_payload.zip' # zipimport: zlib available <<< 18823 1726855011.39046: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.39074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18823 1726855011.39099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18823 1726855011.39150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18823 1726855011.39460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382817b0e0> import '_typing' # <<< 18823 1726855011.39534: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828159fd0> <<< 18823 1726855011.39546: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828159160> <<< 18823 1726855011.39549: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.39598: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 18823 1726855011.39620: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.39636: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.39649: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 18823 1726855011.39670: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.41806: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.43544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 18823 1726855011.43549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828179760> <<< 18823 1726855011.43590: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 18823 1726855011.43597: stdout chunk (state=3): >>> <<< 18823 1726855011.43619: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 18823 1726855011.43638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18823 1726855011.43664: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 18823 1726855011.43670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 18823 1726855011.43710: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38281a2ab0> <<< 
18823 1726855011.43766: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a2870> <<< 18823 1726855011.43805: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a2180> <<< 18823 1726855011.43829: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 18823 1726855011.43832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18823 1726855011.43906: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828b129c0> import 'atexit' # <<< 18823 1726855011.44012: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38281a3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38281a39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18823 1726855011.44034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18823 1726855011.44052: stdout chunk (state=3): >>>import '_locale' # <<< 18823 1726855011.44120: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a3f20> <<< 18823 
1726855011.44123: stdout chunk (state=3): >>>import 'pwd' # <<< 18823 1726855011.44155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18823 1726855011.44184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18823 1726855011.44231: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382800dbb0> <<< 18823 1726855011.44261: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.44265: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.44361: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382800f860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828010260> <<< 18823 1726855011.44380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18823 1726855011.44414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18823 1726855011.44439: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828011130> <<< 18823 1726855011.44461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18823 1726855011.44513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 
18823 1726855011.44537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 18823 1726855011.44543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18823 1726855011.44615: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828013e90> <<< 18823 1726855011.44661: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382815b0b0> <<< 18823 1726855011.44690: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828012150> <<< 18823 1726855011.44874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 18823 1726855011.44878: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801bd10> <<< 18823 1726855011.44880: stdout chunk (state=3): >>>import 
'_tokenize' # <<< 18823 1726855011.44974: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801a7e0> <<< 18823 1726855011.44983: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801a540> <<< 18823 1726855011.45000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 18823 1726855011.45013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18823 1726855011.45120: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801aa50> <<< 18823 1726855011.45161: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828012660> <<< 18823 1726855011.45193: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.45199: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828063ec0> <<< 18823 1726855011.45224: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 18823 1726855011.45229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828063fb0> <<< 18823 1726855011.45258: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18823 
1726855011.45283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18823 1726855011.45306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18823 1726855011.45354: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.45357: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828065ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828065880> <<< 18823 1726855011.45389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18823 1726855011.45553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18823 1726855011.45595: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.45613: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828067fb0> <<< 18823 1726855011.45755: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828066180> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18823 1726855011.45774: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806b710> <<< 18823 1726855011.45954: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280680e0> <<< 18823 1726855011.46025: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806c4d0> <<< 18823 1726855011.46072: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806c500> <<< 18823 1726855011.46124: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806ca10> <<< 18823 1726855011.46151: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280641a0> <<< 18823 1726855011.46176: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18823 1726855011.46217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18823 1726855011.46260: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.46290: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.46296: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38280f81d0> <<< 18823 1726855011.46519: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.46544: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38280f94c0> <<< 18823 1726855011.46551: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806e960> <<< 18823 1726855011.46591: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806fd10><<< 18823 1726855011.46600: 
stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806e570> <<< 18823 1726855011.46656: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18823 1726855011.46777: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.46914: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.46919: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 18823 1726855011.46967: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 18823 1726855011.47053: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.47161: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.47339: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.48190: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.49062: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18823 1726855011.49077: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18823 1726855011.49083: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 18823 1726855011.49109: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18823 1726855011.49258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38280fd670> <<< 18823 1726855011.49297: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 18823 1726855011.49301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 18823 1726855011.49322: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280fe450> <<< 18823 1726855011.49339: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806d820> <<< 18823 1726855011.49404: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 18823 1726855011.49411: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.49430: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.49465: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 18823 1726855011.49469: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.49685: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.49910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 18823 1726855011.49922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18823 1726855011.49925: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280fe4e0> <<< 18823 1726855011.49945: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.50662: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.51396: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.51505: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 
1726855011.51662: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 18823 1726855011.51671: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.51724: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18823 1726855011.51727: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.51832: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.51946: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18823 1726855011.52057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855011.52080: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18823 1726855011.52101: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.52443: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.52808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18823 1726855011.52885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18823 1726855011.52906: stdout chunk (state=3): >>>import '_ast' # <<< 18823 1726855011.53000: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280ff5f0> <<< 18823 1726855011.53015: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.53116: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.53224: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 18823 1726855011.53460: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 
'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 18823 1726855011.53510: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.53600: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.53690: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18823 1726855011.53748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.53853: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 18823 1726855011.53859: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3827f0a120> <<< 18823 1726855011.53911: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3827f07e60> <<< 18823 1726855011.53941: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 18823 1726855011.53961: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.54045: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.54140: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.54172: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.54229: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18823 1726855011.54258: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18823 1726855011.54283: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18823 1726855011.54314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18823 1726855011.54384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18823 1726855011.54410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18823 1726855011.54428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18823 1726855011.54514: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281fe960> <<< 18823 1726855011.54573: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281ea630> <<< 18823 1726855011.54689: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3827f0a0f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828159f40> <<< 18823 1726855011.54702: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 18823 1726855011.54711: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.54754: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.54786: stdout chunk (state=3): >>>import 
'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18823 1726855011.54864: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 18823 1726855011.54957: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18823 1726855011.55120: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.55425: stdout chunk (state=3): >>># zipimport: zlib available <<< 18823 1726855011.55841: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 18823 1726855011.56123: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv <<< 18823 1726855011.56127: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 18823 1726855011.56164: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr <<< 18823 1726855011.56191: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] 
removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 18823 1726855011.56195: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator<<< 18823 1726855011.56200: stdout chunk (state=3): >>> <<< 18823 1726855011.56402: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # 
cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # 
cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 18823 1726855011.56413: stdout chunk (state=3): >>># 
cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info <<< 18823 1726855011.56416: stdout chunk (state=3): >>># destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18823 1726855011.56738: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 18823 1726855011.56743: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18823 1726855011.56776: stdout chunk (state=3): >>># destroy _bz2 <<< 18823 1726855011.56781: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 <<< 18823 1726855011.56797: stdout chunk (state=3): >>># destroy 
binascii <<< 18823 1726855011.56821: stdout chunk (state=3): >>># destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 18823 1726855011.56825: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib <<< 18823 1726855011.56831: stdout chunk (state=3): >>># destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress<<< 18823 1726855011.56837: stdout chunk (state=3): >>> <<< 18823 1726855011.57080: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 18823 1726855011.57092: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # 
cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 18823 1726855011.57142: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 18823 1726855011.57146: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 18823 1726855011.57158: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix <<< 18823 1726855011.57190: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 18823 1726855011.57201: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools<<< 18823 1726855011.57211: stdout chunk (state=3): >>> # cleanup[3] wiping _functools # cleanup[3] wiping collections<<< 18823 1726855011.57237: stdout chunk (state=3): >>> # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 18823 1726855011.57255: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat <<< 18823 1726855011.57271: stdout chunk (state=3): >>># cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 18823 1726855011.57284: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 18823 1726855011.57291: stdout chunk (state=3): >>># 
cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 18823 1726855011.57364: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18823 1726855011.57488: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18823 1726855011.57514: stdout chunk (state=3): >>># destroy _collections <<< 18823 1726855011.57542: stdout chunk (state=3): >>># destroy platform <<< 18823 1726855011.57548: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 18823 1726855011.57577: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 18823 1726855011.57763: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 18823 1726855011.57782: stdout chunk (state=3): >>># destroy time <<< 18823 1726855011.57808: stdout chunk 
(state=3): >>># destroy _random <<< 18823 1726855011.57815: stdout chunk (state=3): >>># destroy _weakref <<< 18823 1726855011.57836: stdout chunk (state=3): >>># destroy _hashlib <<< 18823 1726855011.57845: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 18823 1726855011.57886: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 18823 1726855011.57891: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools<<< 18823 1726855011.57902: stdout chunk (state=3): >>> # destroy builtins # destroy _thread <<< 18823 1726855011.57913: stdout chunk (state=3): >>># clear sys.audit hooks <<< 18823 1726855011.58343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855011.58372: stderr chunk (state=3): >>><<< 18823 1726855011.58375: stdout chunk (state=3): >>><<< 18823 1726855011.58439: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828b12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288ffe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828937890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3828937f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828917b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828915280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fd040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828957800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828956420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828916150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828954c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fc2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382898cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898cbf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382898cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288fade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898d6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898d3a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898e5d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38289a5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38289a7380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f38289a7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38289a7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382872fce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828758740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287584a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828758770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38287590a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828759a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828758950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382872de80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382875ae10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287598e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382898ecc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3828783170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287a74d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288082f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382880aa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38288083e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287cd2e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f38281253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38287a6300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382875bd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f38287a6660> # zipimport: found 30 names in '/tmp/ansible_stat_payload_v8yu0r_w/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382817b0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828159fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828159160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828179760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38281a2ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a2870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a2180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828b129c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38281a3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f38281a39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281a3f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382800dbb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382800f860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828010260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828011130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828013e90> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382815b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828012150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801bd10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801a7e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801a540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382801aa50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828012660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828063ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828063fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828065ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828065880> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3828067fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828066180> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806b710> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280680e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806c4d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806c500> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806ca10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38280f81d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38280f94c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806e960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382806fd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806e570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38280fd670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280fe450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382806d820> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280fe4e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38280ff5f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3827f0a120> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3827f07e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281fe960> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38281ea630> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3827f0a0f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3828159f40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks [... interpreter cleanup/teardown output repeated, identical to the module stdout above ...] # clear sys.audit hooks 18823 1726855011.58970: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855011.58973: _low_level_execute_command(): starting 18823 1726855011.58975: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855010.9632843-18949-148519160361330/ > /dev/null 2>&1 && sleep 0' 18823 1726855011.59123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855011.59128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855011.59131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.59133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855011.59135: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.59137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.59195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 
1726855011.59201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855011.59203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.59274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.61845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.61871: stderr chunk (state=3): >>><<< 18823 1726855011.61874: stdout chunk (state=3): >>><<< 18823 1726855011.61891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 [... SSH debug output identical to the stderr chunks above ...] debug2: Received exit status from master 0 18823 1726855011.61899: handler run complete 18823 1726855011.61917: attempt loop complete, returning result 18823 1726855011.61920: _execute() done 18823 1726855011.61922: dumping result to json 18823 1726855011.61926: done dumping 
result, returning 18823 1726855011.61933: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affcc66-ac2b-d391-077c-00000000008f] 18823 1726855011.61936: sending task result for task 0affcc66-ac2b-d391-077c-00000000008f 18823 1726855011.62023: done sending task result for task 0affcc66-ac2b-d391-077c-00000000008f 18823 1726855011.62026: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 18823 1726855011.62080: no more pending results, returning what we have 18823 1726855011.62082: results queue empty 18823 1726855011.62083: checking for any_errors_fatal 18823 1726855011.62092: done checking for any_errors_fatal 18823 1726855011.62092: checking for max_fail_percentage 18823 1726855011.62097: done checking for max_fail_percentage 18823 1726855011.62097: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.62098: done checking to see if all hosts have failed 18823 1726855011.62099: getting the remaining hosts for this loop 18823 1726855011.62100: done getting the remaining hosts for this loop 18823 1726855011.62103: getting the next task for host managed_node2 18823 1726855011.62109: done getting next task for host managed_node2 18823 1726855011.62111: ^ task is: TASK: Set flag to indicate system is ostree 18823 1726855011.62114: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.62117: getting variables 18823 1726855011.62119: in VariableManager get_vars() 18823 1726855011.62146: Calling all_inventory to load vars for managed_node2 18823 1726855011.62149: Calling groups_inventory to load vars for managed_node2 18823 1726855011.62152: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.62162: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.62165: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.62167: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.62334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.62451: done with get_vars() 18823 1726855011.62459: done getting variables 18823 1726855011.62533: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:56:51 -0400 (0:00:00.745) 0:00:03.277 ****** 18823 1726855011.62554: entering _queue_task() for managed_node2/set_fact 18823 1726855011.62555: Creating lock for set_fact 18823 1726855011.62762: worker is 1 (out of 1 available) 18823 1726855011.62775: exiting _queue_task() for managed_node2/set_fact 18823 1726855011.62786: done queuing things up, now waiting for results queue to drain 18823 1726855011.62790: waiting for pending results... 
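The "Check if system is ostree" task whose result appears above (`ok: [managed_node2] => {"changed": false, "stat": {"exists": false}}`) boils down to a `stat` of the `/run/ostree-booted` marker file. A minimal sketch of that check follows; this is illustrative only, not the actual `ansible.builtin.stat` module, and the function name is invented:

```python
import os

def check_ostree_booted(path="/run/ostree-booted"):
    # rpm-ostree systems create this marker file at boot; its absence
    # ("exists": false in the task result above) indicates a traditional
    # package-based system. Shape mimics the stat task's return value.
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}
```

On managed_node2 the marker is absent, which is why the recorded result reports `"exists": false`.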
18823 1726855011.62928: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 18823 1726855011.62985: in run() - task 0affcc66-ac2b-d391-077c-000000000090 18823 1726855011.62997: variable 'ansible_search_path' from source: unknown 18823 1726855011.63001: variable 'ansible_search_path' from source: unknown 18823 1726855011.63035: calling self._execute() 18823 1726855011.63086: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.63092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.63103: variable 'omit' from source: magic vars 18823 1726855011.63493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855011.63658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855011.63694: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855011.63720: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855011.63744: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855011.63811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855011.63828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855011.63845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855011.63862: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855011.63952: Evaluated conditional (not __network_is_ostree is defined): True 18823 1726855011.63955: variable 'omit' from source: magic vars 18823 1726855011.63981: variable 'omit' from source: magic vars 18823 1726855011.64063: variable '__ostree_booted_stat' from source: set_fact 18823 1726855011.64105: variable 'omit' from source: magic vars 18823 1726855011.64125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855011.64145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855011.64159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855011.64172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855011.64180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855011.64208: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855011.64211: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.64215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.64276: Set connection var ansible_timeout to 10 18823 1726855011.64280: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855011.64283: Set connection var ansible_shell_type to sh 18823 1726855011.64291: Set connection var ansible_shell_executable to /bin/sh 18823 1726855011.64296: Set connection var ansible_connection to ssh 18823 1726855011.64303: Set connection var ansible_pipelining to False 18823 1726855011.64321: variable 'ansible_shell_executable' 
from source: unknown 18823 1726855011.64334: variable 'ansible_connection' from source: unknown 18823 1726855011.64337: variable 'ansible_module_compression' from source: unknown 18823 1726855011.64339: variable 'ansible_shell_type' from source: unknown 18823 1726855011.64341: variable 'ansible_shell_executable' from source: unknown 18823 1726855011.64343: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.64345: variable 'ansible_pipelining' from source: unknown 18823 1726855011.64347: variable 'ansible_timeout' from source: unknown 18823 1726855011.64349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.64413: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855011.64421: variable 'omit' from source: magic vars 18823 1726855011.64427: starting attempt loop 18823 1726855011.64429: running the handler 18823 1726855011.64443: handler run complete 18823 1726855011.64450: attempt loop complete, returning result 18823 1726855011.64452: _execute() done 18823 1726855011.64455: dumping result to json 18823 1726855011.64459: done dumping result, returning 18823 1726855011.64465: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affcc66-ac2b-d391-077c-000000000090] 18823 1726855011.64467: sending task result for task 0affcc66-ac2b-d391-077c-000000000090 18823 1726855011.64538: done sending task result for task 0affcc66-ac2b-d391-077c-000000000090 18823 1726855011.64546: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 18823 1726855011.64597: no more pending results, returning what we have 18823 1726855011.64600: results 
queue empty 18823 1726855011.64601: checking for any_errors_fatal 18823 1726855011.64608: done checking for any_errors_fatal 18823 1726855011.64609: checking for max_fail_percentage 18823 1726855011.64611: done checking for max_fail_percentage 18823 1726855011.64612: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.64612: done checking to see if all hosts have failed 18823 1726855011.64613: getting the remaining hosts for this loop 18823 1726855011.64615: done getting the remaining hosts for this loop 18823 1726855011.64618: getting the next task for host managed_node2 18823 1726855011.64625: done getting next task for host managed_node2 18823 1726855011.64628: ^ task is: TASK: Fix CentOS6 Base repo 18823 1726855011.64630: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.64634: getting variables 18823 1726855011.64635: in VariableManager get_vars() 18823 1726855011.64659: Calling all_inventory to load vars for managed_node2 18823 1726855011.64662: Calling groups_inventory to load vars for managed_node2 18823 1726855011.64664: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.64672: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.64675: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.64683: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.64833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.64947: done with get_vars() 18823 1726855011.64955: done getting variables 18823 1726855011.65039: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:56:51 -0400 (0:00:00.025) 0:00:03.302 ****** 18823 1726855011.65057: entering _queue_task() for managed_node2/copy 18823 1726855011.65245: worker is 1 (out of 1 available) 18823 1726855011.65258: exiting _queue_task() for managed_node2/copy 18823 1726855011.65268: done queuing things up, now waiting for results queue to drain 18823 1726855011.65269: waiting for pending results... 
18823 1726855011.65413: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 18823 1726855011.65464: in run() - task 0affcc66-ac2b-d391-077c-000000000092 18823 1726855011.65473: variable 'ansible_search_path' from source: unknown 18823 1726855011.65476: variable 'ansible_search_path' from source: unknown 18823 1726855011.65509: calling self._execute() 18823 1726855011.65557: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.65561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.65571: variable 'omit' from source: magic vars 18823 1726855011.65898: variable 'ansible_distribution' from source: facts 18823 1726855011.65912: Evaluated conditional (ansible_distribution == 'CentOS'): True 18823 1726855011.65999: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.66003: Evaluated conditional (ansible_distribution_major_version == '6'): False 18823 1726855011.66005: when evaluation is False, skipping this task 18823 1726855011.66008: _execute() done 18823 1726855011.66010: dumping result to json 18823 1726855011.66013: done dumping result, returning 18823 1726855011.66020: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affcc66-ac2b-d391-077c-000000000092] 18823 1726855011.66022: sending task result for task 0affcc66-ac2b-d391-077c-000000000092 18823 1726855011.66116: done sending task result for task 0affcc66-ac2b-d391-077c-000000000092 18823 1726855011.66119: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18823 1726855011.66191: no more pending results, returning what we have 18823 1726855011.66194: results queue empty 18823 1726855011.66205: checking for any_errors_fatal 18823 1726855011.66208: done checking for any_errors_fatal 18823 1726855011.66209: checking for 
max_fail_percentage 18823 1726855011.66211: done checking for max_fail_percentage 18823 1726855011.66212: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.66212: done checking to see if all hosts have failed 18823 1726855011.66213: getting the remaining hosts for this loop 18823 1726855011.66214: done getting the remaining hosts for this loop 18823 1726855011.66217: getting the next task for host managed_node2 18823 1726855011.66222: done getting next task for host managed_node2 18823 1726855011.66224: ^ task is: TASK: Include the task 'enable_epel.yml' 18823 1726855011.66226: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.66230: getting variables 18823 1726855011.66231: in VariableManager get_vars() 18823 1726855011.66248: Calling all_inventory to load vars for managed_node2 18823 1726855011.66250: Calling groups_inventory to load vars for managed_node2 18823 1726855011.66252: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.66258: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.66260: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.66261: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.66368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.66502: done with get_vars() 18823 1726855011.66508: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:56:51 -0400 (0:00:00.015) 0:00:03.317 ****** 18823 1726855011.66569: entering _queue_task() for managed_node2/include_tasks 18823 1726855011.66736: worker is 1 (out of 1 available) 18823 1726855011.66747: exiting _queue_task() for managed_node2/include_tasks 18823 1726855011.66756: done queuing things up, now waiting for results queue to drain 18823 1726855011.66757: waiting for pending results... 
18823 1726855011.66899: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 18823 1726855011.66961: in run() - task 0affcc66-ac2b-d391-077c-000000000093 18823 1726855011.66970: variable 'ansible_search_path' from source: unknown 18823 1726855011.66973: variable 'ansible_search_path' from source: unknown 18823 1726855011.67006: calling self._execute() 18823 1726855011.67054: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.67058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.67068: variable 'omit' from source: magic vars 18823 1726855011.67397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855011.68824: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855011.68872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855011.68902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855011.68929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855011.68951: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855011.69011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855011.69031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855011.69049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855011.69079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855011.69093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855011.69173: variable '__network_is_ostree' from source: set_fact 18823 1726855011.69186: Evaluated conditional (not __network_is_ostree | d(false)): True 18823 1726855011.69191: _execute() done 18823 1726855011.69196: dumping result to json 18823 1726855011.69202: done dumping result, returning 18823 1726855011.69208: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affcc66-ac2b-d391-077c-000000000093] 18823 1726855011.69212: sending task result for task 0affcc66-ac2b-d391-077c-000000000093 18823 1726855011.69292: done sending task result for task 0affcc66-ac2b-d391-077c-000000000093 18823 1726855011.69295: WORKER PROCESS EXITING 18823 1726855011.69325: no more pending results, returning what we have 18823 1726855011.69330: in VariableManager get_vars() 18823 1726855011.69359: Calling all_inventory to load vars for managed_node2 18823 1726855011.69361: Calling groups_inventory to load vars for managed_node2 18823 1726855011.69364: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.69374: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.69376: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.69379: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.69520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 18823 1726855011.69631: done with get_vars() 18823 1726855011.69636: variable 'ansible_search_path' from source: unknown 18823 1726855011.69637: variable 'ansible_search_path' from source: unknown 18823 1726855011.69661: we have included files to process 18823 1726855011.69661: generating all_blocks data 18823 1726855011.69663: done generating all_blocks data 18823 1726855011.69666: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18823 1726855011.69667: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18823 1726855011.69669: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18823 1726855011.70141: done processing included file 18823 1726855011.70143: iterating over new_blocks loaded from include file 18823 1726855011.70144: in VariableManager get_vars() 18823 1726855011.70151: done with get_vars() 18823 1726855011.70152: filtering new block on tags 18823 1726855011.70165: done filtering new block on tags 18823 1726855011.70167: in VariableManager get_vars() 18823 1726855011.70172: done with get_vars() 18823 1726855011.70173: filtering new block on tags 18823 1726855011.70179: done filtering new block on tags 18823 1726855011.70180: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 18823 1726855011.70183: extending task lists for all hosts with included blocks 18823 1726855011.70246: done extending task lists 18823 1726855011.70247: done processing included files 18823 1726855011.70248: results queue empty 18823 1726855011.70248: checking for any_errors_fatal 18823 1726855011.70250: done checking for any_errors_fatal 18823 1726855011.70250: checking for max_fail_percentage 18823 1726855011.70251: done 
checking for max_fail_percentage 18823 1726855011.70252: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.70252: done checking to see if all hosts have failed 18823 1726855011.70253: getting the remaining hosts for this loop 18823 1726855011.70253: done getting the remaining hosts for this loop 18823 1726855011.70255: getting the next task for host managed_node2 18823 1726855011.70257: done getting next task for host managed_node2 18823 1726855011.70258: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18823 1726855011.70260: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.70261: getting variables 18823 1726855011.70262: in VariableManager get_vars() 18823 1726855011.70267: Calling all_inventory to load vars for managed_node2 18823 1726855011.70269: Calling groups_inventory to load vars for managed_node2 18823 1726855011.70270: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.70273: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.70278: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.70280: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.70375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.70482: done with get_vars() 18823 1726855011.70490: done getting variables 18823 1726855011.70536: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18823 1726855011.70666: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:56:51 -0400 (0:00:00.041) 0:00:03.358 ****** 18823 1726855011.70701: entering _queue_task() for managed_node2/command 18823 1726855011.70703: Creating lock for command 18823 1726855011.70910: worker is 1 (out of 1 available) 18823 1726855011.70922: exiting _queue_task() for managed_node2/command 18823 1726855011.70932: done queuing things up, now waiting for results queue to drain 18823 1726855011.70933: waiting for pending results... 
18823 1726855011.71273: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 18823 1726855011.71278: in run() - task 0affcc66-ac2b-d391-077c-0000000000ad 18823 1726855011.71291: variable 'ansible_search_path' from source: unknown 18823 1726855011.71295: variable 'ansible_search_path' from source: unknown 18823 1726855011.71337: calling self._execute() 18823 1726855011.71414: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.71420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.71431: variable 'omit' from source: magic vars 18823 1726855011.71809: variable 'ansible_distribution' from source: facts 18823 1726855011.71817: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18823 1726855011.71950: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.71963: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18823 1726855011.71967: when evaluation is False, skipping this task 18823 1726855011.71969: _execute() done 18823 1726855011.71972: dumping result to json 18823 1726855011.71976: done dumping result, returning 18823 1726855011.71983: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0affcc66-ac2b-d391-077c-0000000000ad] 18823 1726855011.71991: sending task result for task 0affcc66-ac2b-d391-077c-0000000000ad skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18823 1726855011.72145: no more pending results, returning what we have 18823 1726855011.72148: results queue empty 18823 1726855011.72150: checking for any_errors_fatal 18823 1726855011.72151: done checking for any_errors_fatal 18823 1726855011.72152: checking for max_fail_percentage 18823 1726855011.72154: done checking for max_fail_percentage 18823 1726855011.72155: checking to see if all hosts have failed 
and the running result is not ok 18823 1726855011.72155: done checking to see if all hosts have failed 18823 1726855011.72156: getting the remaining hosts for this loop 18823 1726855011.72157: done getting the remaining hosts for this loop 18823 1726855011.72161: getting the next task for host managed_node2 18823 1726855011.72169: done getting next task for host managed_node2 18823 1726855011.72171: ^ task is: TASK: Install yum-utils package 18823 1726855011.72174: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.72178: getting variables 18823 1726855011.72180: in VariableManager get_vars() 18823 1726855011.72213: Calling all_inventory to load vars for managed_node2 18823 1726855011.72216: Calling groups_inventory to load vars for managed_node2 18823 1726855011.72220: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.72235: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.72240: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.72244: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.72540: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000ad 18823 1726855011.72544: WORKER PROCESS EXITING 18823 1726855011.72558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.72806: done with get_vars() 18823 1726855011.72814: done getting variables 18823 1726855011.72915: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:56:51 -0400 (0:00:00.022) 0:00:03.381 ****** 18823 1726855011.72942: entering _queue_task() for managed_node2/package 18823 1726855011.72944: Creating lock for package 18823 1726855011.73307: worker is 1 (out of 1 available) 18823 1726855011.73318: exiting _queue_task() for managed_node2/package 18823 1726855011.73337: done queuing things up, now waiting for results queue to drain 18823 1726855011.73339: waiting for pending results... 
18823 1726855011.73529: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 18823 1726855011.73639: in run() - task 0affcc66-ac2b-d391-077c-0000000000ae 18823 1726855011.73673: variable 'ansible_search_path' from source: unknown 18823 1726855011.73677: variable 'ansible_search_path' from source: unknown 18823 1726855011.73734: calling self._execute() 18823 1726855011.73843: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.73847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.73849: variable 'omit' from source: magic vars 18823 1726855011.74259: variable 'ansible_distribution' from source: facts 18823 1726855011.74284: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18823 1726855011.74437: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.74495: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18823 1726855011.74499: when evaluation is False, skipping this task 18823 1726855011.74501: _execute() done 18823 1726855011.74503: dumping result to json 18823 1726855011.74505: done dumping result, returning 18823 1726855011.74508: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affcc66-ac2b-d391-077c-0000000000ae] 18823 1726855011.74510: sending task result for task 0affcc66-ac2b-d391-077c-0000000000ae 18823 1726855011.74748: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000ae 18823 1726855011.74752: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18823 1726855011.74793: no more pending results, returning what we have 18823 1726855011.74796: results queue empty 18823 1726855011.74797: checking for any_errors_fatal 18823 1726855011.74801: done checking for any_errors_fatal 18823 
1726855011.74802: checking for max_fail_percentage 18823 1726855011.74804: done checking for max_fail_percentage 18823 1726855011.74804: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.74805: done checking to see if all hosts have failed 18823 1726855011.74806: getting the remaining hosts for this loop 18823 1726855011.74807: done getting the remaining hosts for this loop 18823 1726855011.74810: getting the next task for host managed_node2 18823 1726855011.74816: done getting next task for host managed_node2 18823 1726855011.74818: ^ task is: TASK: Enable EPEL 7 18823 1726855011.74821: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.74824: getting variables 18823 1726855011.74826: in VariableManager get_vars() 18823 1726855011.74849: Calling all_inventory to load vars for managed_node2 18823 1726855011.74852: Calling groups_inventory to load vars for managed_node2 18823 1726855011.74855: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.74865: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.74869: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.74872: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.75091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.75303: done with get_vars() 18823 1726855011.75312: done getting variables 18823 1726855011.75375: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:56:51 -0400 (0:00:00.024) 0:00:03.405 ****** 18823 1726855011.75406: entering _queue_task() for managed_node2/command 18823 1726855011.75616: worker is 1 (out of 1 available) 18823 1726855011.75629: exiting _queue_task() for managed_node2/command 18823 1726855011.75640: done queuing things up, now waiting for results queue to drain 18823 1726855011.75641: waiting for pending results... 
18823 1726855011.75830: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 18823 1726855011.75943: in run() - task 0affcc66-ac2b-d391-077c-0000000000af 18823 1726855011.75952: variable 'ansible_search_path' from source: unknown 18823 1726855011.75955: variable 'ansible_search_path' from source: unknown 18823 1726855011.75983: calling self._execute() 18823 1726855011.76042: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.76046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.76054: variable 'omit' from source: magic vars 18823 1726855011.76315: variable 'ansible_distribution' from source: facts 18823 1726855011.76324: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18823 1726855011.76415: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.76418: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18823 1726855011.76421: when evaluation is False, skipping this task 18823 1726855011.76424: _execute() done 18823 1726855011.76426: dumping result to json 18823 1726855011.76428: done dumping result, returning 18823 1726855011.76434: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affcc66-ac2b-d391-077c-0000000000af] 18823 1726855011.76439: sending task result for task 0affcc66-ac2b-d391-077c-0000000000af 18823 1726855011.76517: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000af 18823 1726855011.76520: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18823 1726855011.76563: no more pending results, returning what we have 18823 1726855011.76566: results queue empty 18823 1726855011.76567: checking for any_errors_fatal 18823 1726855011.76571: done checking for any_errors_fatal 18823 1726855011.76572: checking for 
max_fail_percentage 18823 1726855011.76573: done checking for max_fail_percentage 18823 1726855011.76574: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.76575: done checking to see if all hosts have failed 18823 1726855011.76575: getting the remaining hosts for this loop 18823 1726855011.76577: done getting the remaining hosts for this loop 18823 1726855011.76580: getting the next task for host managed_node2 18823 1726855011.76586: done getting next task for host managed_node2 18823 1726855011.76590: ^ task is: TASK: Enable EPEL 8 18823 1726855011.76593: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.76596: getting variables 18823 1726855011.76597: in VariableManager get_vars() 18823 1726855011.76620: Calling all_inventory to load vars for managed_node2 18823 1726855011.76622: Calling groups_inventory to load vars for managed_node2 18823 1726855011.76625: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.76635: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.76638: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.76640: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.76775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.76892: done with get_vars() 18823 1726855011.76899: done getting variables 18823 1726855011.76936: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:56:51 -0400 (0:00:00.015) 0:00:03.421 ****** 18823 1726855011.76954: entering _queue_task() for managed_node2/command 18823 1726855011.77123: worker is 1 (out of 1 available) 18823 1726855011.77135: exiting _queue_task() for managed_node2/command 18823 1726855011.77146: done queuing things up, now waiting for results queue to drain 18823 1726855011.77147: waiting for pending results... 
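The "Enable EPEL 7" task above is skipped because its `when` guard fails: the log records `ansible_distribution in ['RedHat', 'CentOS']` evaluating True but `ansible_distribution_major_version in ['7', '8']` evaluating False, so the managed node runs a major version outside that list. A minimal sketch of a task guarded this way, assuming a typical shape — the actual body of `enable_epel.yml` is not shown in this log, and the action and variable name below are placeholders:

```yaml
# Hedged sketch only; the real task lives at
# tests/network/tasks/enable_epel.yml:32 and its body is not shown here.
- name: Enable EPEL 7
  command: "{{ epel_enable_command }}"  # hypothetical placeholder action
  when:
    - ansible_distribution in ['RedHat', 'CentOS']      # logged as True
    - ansible_distribution_major_version in ['7', '8']  # logged as False, so skip
```

When any item in the `when` list evaluates false, `_execute()` short-circuits without dispatching the module and returns a `skipping:` result carrying `false_condition` and `skip_reason`, exactly as in the JSON result above.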
18823 1726855011.77284: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 18823 1726855011.77348: in run() - task 0affcc66-ac2b-d391-077c-0000000000b0 18823 1726855011.77358: variable 'ansible_search_path' from source: unknown 18823 1726855011.77361: variable 'ansible_search_path' from source: unknown 18823 1726855011.77391: calling self._execute() 18823 1726855011.77442: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.77445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.77453: variable 'omit' from source: magic vars 18823 1726855011.77712: variable 'ansible_distribution' from source: facts 18823 1726855011.77720: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18823 1726855011.77803: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.77808: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18823 1726855011.77812: when evaluation is False, skipping this task 18823 1726855011.77816: _execute() done 18823 1726855011.77818: dumping result to json 18823 1726855011.77821: done dumping result, returning 18823 1726855011.77833: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affcc66-ac2b-d391-077c-0000000000b0] 18823 1726855011.77836: sending task result for task 0affcc66-ac2b-d391-077c-0000000000b0 18823 1726855011.77908: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000b0 18823 1726855011.77911: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18823 1726855011.78014: no more pending results, returning what we have 18823 1726855011.78024: results queue empty 18823 1726855011.78025: checking for any_errors_fatal 18823 1726855011.78028: done checking for any_errors_fatal 18823 1726855011.78029: checking for 
max_fail_percentage 18823 1726855011.78031: done checking for max_fail_percentage 18823 1726855011.78031: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.78032: done checking to see if all hosts have failed 18823 1726855011.78033: getting the remaining hosts for this loop 18823 1726855011.78034: done getting the remaining hosts for this loop 18823 1726855011.78042: getting the next task for host managed_node2 18823 1726855011.78049: done getting next task for host managed_node2 18823 1726855011.78055: ^ task is: TASK: Enable EPEL 6 18823 1726855011.78067: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.78070: getting variables 18823 1726855011.78072: in VariableManager get_vars() 18823 1726855011.78096: Calling all_inventory to load vars for managed_node2 18823 1726855011.78098: Calling groups_inventory to load vars for managed_node2 18823 1726855011.78100: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.78107: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.78109: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.78111: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.78242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.78377: done with get_vars() 18823 1726855011.78386: done getting variables 18823 1726855011.78433: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:56:51 -0400 (0:00:00.014) 0:00:03.436 ****** 18823 1726855011.78456: entering _queue_task() for managed_node2/copy 18823 1726855011.78652: worker is 1 (out of 1 available) 18823 1726855011.78666: exiting _queue_task() for managed_node2/copy 18823 1726855011.78680: done queuing things up, now waiting for results queue to drain 18823 1726855011.78681: waiting for pending results... 
18823 1726855011.78914: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 18823 1726855011.78980: in run() - task 0affcc66-ac2b-d391-077c-0000000000b2 18823 1726855011.78991: variable 'ansible_search_path' from source: unknown 18823 1726855011.78997: variable 'ansible_search_path' from source: unknown 18823 1726855011.79021: calling self._execute() 18823 1726855011.79070: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.79075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.79085: variable 'omit' from source: magic vars 18823 1726855011.79333: variable 'ansible_distribution' from source: facts 18823 1726855011.79343: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18823 1726855011.79421: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.79425: Evaluated conditional (ansible_distribution_major_version == '6'): False 18823 1726855011.79427: when evaluation is False, skipping this task 18823 1726855011.79431: _execute() done 18823 1726855011.79434: dumping result to json 18823 1726855011.79438: done dumping result, returning 18823 1726855011.79444: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affcc66-ac2b-d391-077c-0000000000b2] 18823 1726855011.79449: sending task result for task 0affcc66-ac2b-d391-077c-0000000000b2 18823 1726855011.79531: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000b2 18823 1726855011.79534: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18823 1726855011.79579: no more pending results, returning what we have 18823 1726855011.79582: results queue empty 18823 1726855011.79583: checking for any_errors_fatal 18823 1726855011.79586: done checking for any_errors_fatal 18823 1726855011.79589: checking for max_fail_percentage 
18823 1726855011.79590: done checking for max_fail_percentage 18823 1726855011.79591: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.79592: done checking to see if all hosts have failed 18823 1726855011.79592: getting the remaining hosts for this loop 18823 1726855011.79593: done getting the remaining hosts for this loop 18823 1726855011.79598: getting the next task for host managed_node2 18823 1726855011.79606: done getting next task for host managed_node2 18823 1726855011.79608: ^ task is: TASK: Set network provider to 'nm' 18823 1726855011.79610: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855011.79613: getting variables 18823 1726855011.79614: in VariableManager get_vars() 18823 1726855011.79635: Calling all_inventory to load vars for managed_node2 18823 1726855011.79637: Calling groups_inventory to load vars for managed_node2 18823 1726855011.79639: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.79652: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.79654: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.79656: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.79902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.80007: done with get_vars() 18823 1726855011.80013: done getting variables 18823 1726855011.80048: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Friday 20 September 2024 13:56:51 -0400 (0:00:00.016) 0:00:03.452 ****** 18823 1726855011.80064: entering _queue_task() for managed_node2/set_fact 18823 1726855011.80216: worker is 1 (out of 1 available) 18823 1726855011.80227: exiting _queue_task() for managed_node2/set_fact 18823 1726855011.80237: done queuing things up, now waiting for results queue to drain 18823 1726855011.80237: waiting for pending results... 18823 1726855011.80377: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 18823 1726855011.80437: in run() - task 0affcc66-ac2b-d391-077c-000000000007 18823 1726855011.80447: variable 'ansible_search_path' from source: unknown 18823 1726855011.80479: calling self._execute() 18823 1726855011.80577: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.80580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.80584: variable 'omit' from source: magic vars 18823 1726855011.80700: variable 'omit' from source: magic vars 18823 1726855011.80739: variable 'omit' from source: magic vars 18823 1726855011.80778: variable 'omit' from source: magic vars 18823 1726855011.80992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855011.80999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855011.81002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855011.81004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855011.81006: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855011.81007: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855011.81009: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.81011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.81066: Set connection var ansible_timeout to 10 18823 1726855011.81080: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855011.81092: Set connection var ansible_shell_type to sh 18823 1726855011.81110: Set connection var ansible_shell_executable to /bin/sh 18823 1726855011.81122: Set connection var ansible_connection to ssh 18823 1726855011.81132: Set connection var ansible_pipelining to False 18823 1726855011.81160: variable 'ansible_shell_executable' from source: unknown 18823 1726855011.81168: variable 'ansible_connection' from source: unknown 18823 1726855011.81174: variable 'ansible_module_compression' from source: unknown 18823 1726855011.81180: variable 'ansible_shell_type' from source: unknown 18823 1726855011.81186: variable 'ansible_shell_executable' from source: unknown 18823 1726855011.81198: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.81207: variable 'ansible_pipelining' from source: unknown 18823 1726855011.81215: variable 'ansible_timeout' from source: unknown 18823 1726855011.81221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.81355: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855011.81370: variable 'omit' from source: magic vars 18823 1726855011.81379: starting 
attempt loop 18823 1726855011.81385: running the handler 18823 1726855011.81405: handler run complete 18823 1726855011.81418: attempt loop complete, returning result 18823 1726855011.81425: _execute() done 18823 1726855011.81433: dumping result to json 18823 1726855011.81440: done dumping result, returning 18823 1726855011.81448: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affcc66-ac2b-d391-077c-000000000007] 18823 1726855011.81456: sending task result for task 0affcc66-ac2b-d391-077c-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 18823 1726855011.81585: no more pending results, returning what we have 18823 1726855011.81590: results queue empty 18823 1726855011.81591: checking for any_errors_fatal 18823 1726855011.81598: done checking for any_errors_fatal 18823 1726855011.81598: checking for max_fail_percentage 18823 1726855011.81600: done checking for max_fail_percentage 18823 1726855011.81601: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.81601: done checking to see if all hosts have failed 18823 1726855011.81602: getting the remaining hosts for this loop 18823 1726855011.81603: done getting the remaining hosts for this loop 18823 1726855011.81606: getting the next task for host managed_node2 18823 1726855011.81613: done getting next task for host managed_node2 18823 1726855011.81614: ^ task is: TASK: meta (flush_handlers) 18823 1726855011.81616: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.81619: getting variables 18823 1726855011.81621: in VariableManager get_vars() 18823 1726855011.81649: Calling all_inventory to load vars for managed_node2 18823 1726855011.81651: Calling groups_inventory to load vars for managed_node2 18823 1726855011.81654: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.81664: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.81666: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.81669: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.81831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.82038: done with get_vars() 18823 1726855011.82047: done getting variables 18823 1726855011.82082: done sending task result for task 0affcc66-ac2b-d391-077c-000000000007 18823 1726855011.82085: WORKER PROCESS EXITING 18823 1726855011.82127: in VariableManager get_vars() 18823 1726855011.82135: Calling all_inventory to load vars for managed_node2 18823 1726855011.82138: Calling groups_inventory to load vars for managed_node2 18823 1726855011.82140: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.82144: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.82146: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.82149: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.82327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.82532: done with get_vars() 18823 1726855011.82544: done queuing things up, now waiting for results queue to drain 18823 1726855011.82545: results queue empty 18823 1726855011.82546: checking for any_errors_fatal 18823 1726855011.82548: done checking for any_errors_fatal 18823 1726855011.82548: checking for max_fail_percentage 18823 
1726855011.82549: done checking for max_fail_percentage 18823 1726855011.82550: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.82551: done checking to see if all hosts have failed 18823 1726855011.82551: getting the remaining hosts for this loop 18823 1726855011.82552: done getting the remaining hosts for this loop 18823 1726855011.82554: getting the next task for host managed_node2 18823 1726855011.82557: done getting next task for host managed_node2 18823 1726855011.82558: ^ task is: TASK: meta (flush_handlers) 18823 1726855011.82560: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855011.82566: getting variables 18823 1726855011.82567: in VariableManager get_vars() 18823 1726855011.82573: Calling all_inventory to load vars for managed_node2 18823 1726855011.82575: Calling groups_inventory to load vars for managed_node2 18823 1726855011.82578: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.82582: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.82584: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.82589: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.82741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.82932: done with get_vars() 18823 1726855011.82948: done getting variables 18823 1726855011.82992: in VariableManager get_vars() 18823 1726855011.82999: Calling all_inventory to load vars for managed_node2 18823 1726855011.83001: Calling groups_inventory to load vars for managed_node2 18823 1726855011.83003: Calling all_plugins_inventory to load vars for managed_node2 18823 
1726855011.83007: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.83010: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.83012: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.83163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.83397: done with get_vars() 18823 1726855011.83408: done queuing things up, now waiting for results queue to drain 18823 1726855011.83410: results queue empty 18823 1726855011.83411: checking for any_errors_fatal 18823 1726855011.83412: done checking for any_errors_fatal 18823 1726855011.83413: checking for max_fail_percentage 18823 1726855011.83414: done checking for max_fail_percentage 18823 1726855011.83414: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.83415: done checking to see if all hosts have failed 18823 1726855011.83416: getting the remaining hosts for this loop 18823 1726855011.83416: done getting the remaining hosts for this loop 18823 1726855011.83419: getting the next task for host managed_node2 18823 1726855011.83421: done getting next task for host managed_node2 18823 1726855011.83422: ^ task is: None 18823 1726855011.83424: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.83425: done queuing things up, now waiting for results queue to drain 18823 1726855011.83426: results queue empty 18823 1726855011.83426: checking for any_errors_fatal 18823 1726855011.83427: done checking for any_errors_fatal 18823 1726855011.83428: checking for max_fail_percentage 18823 1726855011.83429: done checking for max_fail_percentage 18823 1726855011.83429: checking to see if all hosts have failed and the running result is not ok 18823 1726855011.83430: done checking to see if all hosts have failed 18823 1726855011.83432: getting the next task for host managed_node2 18823 1726855011.83434: done getting next task for host managed_node2 18823 1726855011.83435: ^ task is: None 18823 1726855011.83436: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.83480: in VariableManager get_vars() 18823 1726855011.83507: done with get_vars() 18823 1726855011.83512: in VariableManager get_vars() 18823 1726855011.83520: done with get_vars() 18823 1726855011.83524: variable 'omit' from source: magic vars 18823 1726855011.83553: in VariableManager get_vars() 18823 1726855011.83561: done with get_vars() 18823 1726855011.83579: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 18823 1726855011.83774: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855011.83808: getting the remaining hosts for this loop 18823 1726855011.83810: done getting the remaining hosts for this loop 18823 1726855011.83812: getting the next task for host managed_node2 18823 1726855011.83814: done getting next task for host managed_node2 18823 1726855011.83816: ^ task is: TASK: Gathering Facts 18823 1726855011.83817: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855011.83819: getting variables 18823 1726855011.83820: in VariableManager get_vars() 18823 1726855011.83828: Calling all_inventory to load vars for managed_node2 18823 1726855011.83830: Calling groups_inventory to load vars for managed_node2 18823 1726855011.83832: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855011.83836: Calling all_plugins_play to load vars for managed_node2 18823 1726855011.83848: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855011.83851: Calling groups_plugins_play to load vars for managed_node2 18823 1726855011.83981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855011.84153: done with get_vars() 18823 1726855011.84158: done getting variables 18823 1726855011.84204: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 13:56:51 -0400 (0:00:00.041) 0:00:03.493 ****** 18823 1726855011.84224: entering _queue_task() for managed_node2/gather_facts 18823 1726855011.84430: worker is 1 (out of 1 available) 18823 1726855011.84440: exiting _queue_task() for managed_node2/gather_facts 18823 1726855011.84450: done queuing things up, now waiting for results queue to drain 18823 1726855011.84451: waiting for pending results... 
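The `ok: [managed_node2]` result earlier in this run (`ansible_facts: {"network_provider": "nm"}, changed: false`) comes from a `set_fact` task that, per the logged task path, is defined at `tests_ethernet_nm.yml:13`. A minimal sketch consistent with that result — only the fact name and value are taken from the log; the rest of the file's content is an assumption:

```yaml
# Sketch consistent with the logged result; the surrounding play
# in tests_ethernet_nm.yml is not shown in this log.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

`set_fact` runs entirely on the controller (the log shows "running the handler" / "handler run complete" with no remote command), which is why no SSH activity appears for it — in contrast to the "Gathering Facts" task queued above, which immediately opens a multiplexed SSH session and creates a remote temp directory.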
18823 1726855011.84602: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855011.84657: in run() - task 0affcc66-ac2b-d391-077c-0000000000d8 18823 1726855011.84668: variable 'ansible_search_path' from source: unknown 18823 1726855011.84704: calling self._execute() 18823 1726855011.84755: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.84758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.84766: variable 'omit' from source: magic vars 18823 1726855011.85034: variable 'ansible_distribution_major_version' from source: facts 18823 1726855011.85044: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855011.85047: variable 'omit' from source: magic vars 18823 1726855011.85067: variable 'omit' from source: magic vars 18823 1726855011.85094: variable 'omit' from source: magic vars 18823 1726855011.85134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855011.85156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855011.85172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855011.85185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855011.85196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855011.85223: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855011.85226: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.85229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.85296: Set connection var ansible_timeout to 10 18823 1726855011.85305: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855011.85307: Set connection var ansible_shell_type to sh 18823 1726855011.85312: Set connection var ansible_shell_executable to /bin/sh 18823 1726855011.85317: Set connection var ansible_connection to ssh 18823 1726855011.85324: Set connection var ansible_pipelining to False 18823 1726855011.85349: variable 'ansible_shell_executable' from source: unknown 18823 1726855011.85352: variable 'ansible_connection' from source: unknown 18823 1726855011.85355: variable 'ansible_module_compression' from source: unknown 18823 1726855011.85358: variable 'ansible_shell_type' from source: unknown 18823 1726855011.85360: variable 'ansible_shell_executable' from source: unknown 18823 1726855011.85362: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855011.85364: variable 'ansible_pipelining' from source: unknown 18823 1726855011.85366: variable 'ansible_timeout' from source: unknown 18823 1726855011.85369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855011.85495: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855011.85506: variable 'omit' from source: magic vars 18823 1726855011.85510: starting attempt loop 18823 1726855011.85513: running the handler 18823 1726855011.85526: variable 'ansible_facts' from source: unknown 18823 1726855011.85542: _low_level_execute_command(): starting 18823 1726855011.85547: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855011.86076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 
1726855011.86167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.86171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.86245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.88615: stdout chunk (state=3): >>>/root <<< 18823 1726855011.88902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.88905: stdout chunk (state=3): >>><<< 18823 1726855011.88908: stderr chunk (state=3): >>><<< 18823 1726855011.88911: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855011.88914: _low_level_execute_command(): starting 18823 1726855011.88916: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900 `" && echo ansible-tmp-1726855011.888329-18996-198938990990900="` echo /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900 `" ) && sleep 0' 18823 1726855011.89560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855011.89601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.89626: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855011.89639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855011.89670: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.89748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855011.89797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.89890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.92714: stdout chunk (state=3): >>>ansible-tmp-1726855011.888329-18996-198938990990900=/root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900 <<< 18823 1726855011.92922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.92926: stdout chunk (state=3): >>><<< 18823 1726855011.92928: stderr chunk (state=3): >>><<< 18823 1726855011.92947: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855011.888329-18996-198938990990900=/root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855011.93096: variable 'ansible_module_compression' from source: unknown 18823 1726855011.93100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855011.93138: variable 'ansible_facts' from source: unknown 18823 1726855011.93350: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py 18823 1726855011.93757: Sending initial data 18823 1726855011.93761: Sent initial data (153 bytes) 18823 1726855011.94374: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855011.94430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.94512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855011.96764: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855011.96832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855011.96920: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmps0wddl46 /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py <<< 18823 1726855011.96923: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py" <<< 18823 1726855011.96984: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmps0wddl46" to remote "/root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py" <<< 18823 1726855011.98733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855011.98877: stderr chunk (state=3): >>><<< 18823 1726855011.98881: stdout chunk (state=3): >>><<< 18823 1726855011.98883: done transferring module to remote 18823 1726855011.98885: _low_level_execute_command(): starting 18823 1726855011.98894: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/ /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py && sleep 0' 18823 1726855011.99498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855011.99515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855011.99530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855011.99555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855011.99573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 
1726855011.99586: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855011.99673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855011.99704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855011.99733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855011.99749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855011.99865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855012.02565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855012.02577: stdout chunk (state=3): >>><<< 18823 1726855012.02657: stderr chunk (state=3): >>><<< 18823 1726855012.02661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855012.02663: _low_level_execute_command(): starting 18823 1726855012.02665: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/AnsiballZ_setup.py && sleep 0' 18823 1726855012.03352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855012.03443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855012.03478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855012.03532: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855012.03550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855012.03585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855012.03718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855012.85941: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_loadavg": {"1m": 0.615234375, "5m": 0.4150390625, "15m": 0.2099609375}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": 
"#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 795, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794729984, "block_size": 4096, "block_total": 65519099, "block_available": 63914729, "block_used": 1604370, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "52", "epoch": "1726855012", "epoch_int": "1726855012", "date": "2024-09-20", "time": "13:56:52", "iso8601_micro": "2024-09-20T17:56:52.804074Z", "iso8601": "2024-09-20T17:56:52Z", "iso8601_basic": "20240920T135652804074", 
"iso8601_basic_short": "20240920T135652", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_fips": false, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855012.88432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855012.88521: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. <<< 18823 1726855012.88524: stdout chunk (state=3): >>><<< 18823 1726855012.88527: stderr chunk (state=3): >>><<< 18823 1726855012.88567: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_loadavg": {"1m": 0.615234375, "5m": 0.4150390625, "15m": 0.2099609375}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 795, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794729984, "block_size": 4096, "block_total": 65519099, "block_available": 63914729, "block_used": 1604370, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "52", "epoch": "1726855012", "epoch_int": "1726855012", "date": "2024-09-20", "time": "13:56:52", "iso8601_micro": "2024-09-20T17:56:52.804074Z", "iso8601": "2024-09-20T17:56:52Z", "iso8601_basic": "20240920T135652804074", "iso8601_basic_short": "20240920T135652", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off 
[fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": 
"on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855012.89180: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855012.89183: _low_level_execute_command(): starting 18823 1726855012.89185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855011.888329-18996-198938990990900/ > /dev/null 2>&1 && sleep 0' 18823 1726855012.89869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855012.89892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855012.89911: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 18823 1726855012.89930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855012.89968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855012.90005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855012.90085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855012.90113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855012.90146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855012.90223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855012.92108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855012.92133: stderr chunk (state=3): >>><<< 18823 1726855012.92137: stdout chunk (state=3): >>><<< 18823 1726855012.92152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855012.92159: handler run complete 18823 1726855012.92243: variable 'ansible_facts' from source: unknown 18823 1726855012.92310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855012.92496: variable 'ansible_facts' from source: unknown 18823 1726855012.92553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855012.92638: attempt loop complete, returning result 18823 1726855012.92642: _execute() done 18823 1726855012.92644: dumping result to json 18823 1726855012.92668: done dumping result, returning 18823 1726855012.92674: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-0000000000d8] 18823 1726855012.92678: sending task result for task 0affcc66-ac2b-d391-077c-0000000000d8 18823 1726855012.92968: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000d8 18823 1726855012.92971: WORKER PROCESS EXITING ok: [managed_node2] 18823 1726855012.93172: no more pending results, returning what we have 18823 1726855012.93174: results queue 
empty 18823 1726855012.93175: checking for any_errors_fatal 18823 1726855012.93175: done checking for any_errors_fatal 18823 1726855012.93176: checking for max_fail_percentage 18823 1726855012.93177: done checking for max_fail_percentage 18823 1726855012.93178: checking to see if all hosts have failed and the running result is not ok 18823 1726855012.93178: done checking to see if all hosts have failed 18823 1726855012.93178: getting the remaining hosts for this loop 18823 1726855012.93179: done getting the remaining hosts for this loop 18823 1726855012.93181: getting the next task for host managed_node2 18823 1726855012.93185: done getting next task for host managed_node2 18823 1726855012.93186: ^ task is: TASK: meta (flush_handlers) 18823 1726855012.93193: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855012.93200: getting variables 18823 1726855012.93201: in VariableManager get_vars() 18823 1726855012.93217: Calling all_inventory to load vars for managed_node2 18823 1726855012.93218: Calling groups_inventory to load vars for managed_node2 18823 1726855012.93220: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855012.93228: Calling all_plugins_play to load vars for managed_node2 18823 1726855012.93229: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855012.93231: Calling groups_plugins_play to load vars for managed_node2 18823 1726855012.93336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855012.93447: done with get_vars() 18823 1726855012.93454: done getting variables 18823 1726855012.93523: in VariableManager get_vars() 18823 1726855012.93532: Calling all_inventory to load vars for managed_node2 18823 1726855012.93534: Calling groups_inventory to load vars for managed_node2 18823 1726855012.93536: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855012.93541: Calling all_plugins_play to load vars for managed_node2 18823 1726855012.93543: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855012.93546: Calling groups_plugins_play to load vars for managed_node2 18823 1726855012.93745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855012.93925: done with get_vars() 18823 1726855012.93937: done queuing things up, now waiting for results queue to drain 18823 1726855012.93938: results queue empty 18823 1726855012.93939: checking for any_errors_fatal 18823 1726855012.93942: done checking for any_errors_fatal 18823 1726855012.93948: checking for max_fail_percentage 18823 1726855012.93949: done checking for max_fail_percentage 18823 1726855012.93949: checking to see if all hosts have failed and the running result is not 
ok 18823 1726855012.93950: done checking to see if all hosts have failed 18823 1726855012.93951: getting the remaining hosts for this loop 18823 1726855012.93952: done getting the remaining hosts for this loop 18823 1726855012.93954: getting the next task for host managed_node2 18823 1726855012.93958: done getting next task for host managed_node2 18823 1726855012.93960: ^ task is: TASK: Show inside ethernet tests 18823 1726855012.93961: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855012.93963: getting variables 18823 1726855012.93964: in VariableManager get_vars() 18823 1726855012.93972: Calling all_inventory to load vars for managed_node2 18823 1726855012.93973: Calling groups_inventory to load vars for managed_node2 18823 1726855012.93975: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855012.93980: Calling all_plugins_play to load vars for managed_node2 18823 1726855012.93982: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855012.93991: Calling groups_plugins_play to load vars for managed_node2 18823 1726855012.94128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855012.94531: done with get_vars() 18823 1726855012.94539: done getting variables 18823 1726855012.94625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 13:56:52 -0400 (0:00:01.104) 0:00:04.598 ****** 18823 1726855012.94653: entering _queue_task() for managed_node2/debug 18823 1726855012.94655: Creating lock for debug 18823 1726855012.95072: worker is 1 (out of 1 available) 18823 1726855012.95088: exiting _queue_task() for managed_node2/debug 18823 1726855012.95099: done queuing things up, now waiting for results queue to drain 18823 1726855012.95100: waiting for pending results... 18823 1726855012.95608: running TaskExecutor() for managed_node2/TASK: Show inside ethernet tests 18823 1726855012.95613: in run() - task 0affcc66-ac2b-d391-077c-00000000000b 18823 1726855012.95616: variable 'ansible_search_path' from source: unknown 18823 1726855012.95619: calling self._execute() 18823 1726855012.95622: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855012.95624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855012.95693: variable 'omit' from source: magic vars 18823 1726855012.95990: variable 'ansible_distribution_major_version' from source: facts 18823 1726855012.95994: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855012.96001: variable 'omit' from source: magic vars 18823 1726855012.96036: variable 'omit' from source: magic vars 18823 1726855012.96061: variable 'omit' from source: magic vars 18823 1726855012.96097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855012.96128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855012.96144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855012.96157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 18823 1726855012.96166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855012.96191: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855012.96194: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855012.96204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855012.96269: Set connection var ansible_timeout to 10 18823 1726855012.96272: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855012.96275: Set connection var ansible_shell_type to sh 18823 1726855012.96280: Set connection var ansible_shell_executable to /bin/sh 18823 1726855012.96286: Set connection var ansible_connection to ssh 18823 1726855012.96293: Set connection var ansible_pipelining to False 18823 1726855012.96317: variable 'ansible_shell_executable' from source: unknown 18823 1726855012.96321: variable 'ansible_connection' from source: unknown 18823 1726855012.96324: variable 'ansible_module_compression' from source: unknown 18823 1726855012.96327: variable 'ansible_shell_type' from source: unknown 18823 1726855012.96329: variable 'ansible_shell_executable' from source: unknown 18823 1726855012.96332: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855012.96334: variable 'ansible_pipelining' from source: unknown 18823 1726855012.96336: variable 'ansible_timeout' from source: unknown 18823 1726855012.96338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855012.96442: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 
1726855012.96454: variable 'omit' from source: magic vars 18823 1726855012.96458: starting attempt loop 18823 1726855012.96460: running the handler 18823 1726855012.96494: handler run complete 18823 1726855012.96513: attempt loop complete, returning result 18823 1726855012.96516: _execute() done 18823 1726855012.96519: dumping result to json 18823 1726855012.96523: done dumping result, returning 18823 1726855012.96528: done running TaskExecutor() for managed_node2/TASK: Show inside ethernet tests [0affcc66-ac2b-d391-077c-00000000000b] 18823 1726855012.96539: sending task result for task 0affcc66-ac2b-d391-077c-00000000000b 18823 1726855012.96623: done sending task result for task 0affcc66-ac2b-d391-077c-00000000000b 18823 1726855012.96626: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Inside ethernet tests 18823 1726855012.96671: no more pending results, returning what we have 18823 1726855012.96674: results queue empty 18823 1726855012.96675: checking for any_errors_fatal 18823 1726855012.96677: done checking for any_errors_fatal 18823 1726855012.96678: checking for max_fail_percentage 18823 1726855012.96679: done checking for max_fail_percentage 18823 1726855012.96680: checking to see if all hosts have failed and the running result is not ok 18823 1726855012.96681: done checking to see if all hosts have failed 18823 1726855012.96681: getting the remaining hosts for this loop 18823 1726855012.96684: done getting the remaining hosts for this loop 18823 1726855012.96689: getting the next task for host managed_node2 18823 1726855012.96708: done getting next task for host managed_node2 18823 1726855012.96712: ^ task is: TASK: Show network_provider 18823 1726855012.96713: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855012.96717: getting variables 18823 1726855012.96719: in VariableManager get_vars() 18823 1726855012.96745: Calling all_inventory to load vars for managed_node2 18823 1726855012.96748: Calling groups_inventory to load vars for managed_node2 18823 1726855012.96751: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855012.96761: Calling all_plugins_play to load vars for managed_node2 18823 1726855012.96763: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855012.96766: Calling groups_plugins_play to load vars for managed_node2 18823 1726855012.96913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855012.97027: done with get_vars() 18823 1726855012.97035: done getting variables 18823 1726855012.97074: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 13:56:52 -0400 (0:00:00.024) 0:00:04.622 ****** 18823 1726855012.97098: entering _queue_task() for managed_node2/debug 18823 1726855012.97373: worker is 1 (out of 1 available) 18823 1726855012.97383: exiting _queue_task() for managed_node2/debug 18823 1726855012.97397: done queuing things up, now waiting for results queue to drain 18823 1726855012.97398: waiting for pending results... 
18823 1726855012.97852: running TaskExecutor() for managed_node2/TASK: Show network_provider 18823 1726855012.97932: in run() - task 0affcc66-ac2b-d391-077c-00000000000c 18823 1726855012.97939: variable 'ansible_search_path' from source: unknown 18823 1726855012.97947: calling self._execute() 18823 1726855012.98040: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855012.98051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855012.98064: variable 'omit' from source: magic vars 18823 1726855012.98459: variable 'ansible_distribution_major_version' from source: facts 18823 1726855012.98494: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855012.98508: variable 'omit' from source: magic vars 18823 1726855012.98540: variable 'omit' from source: magic vars 18823 1726855012.98793: variable 'omit' from source: magic vars 18823 1726855012.98796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855012.98799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855012.98801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855012.98803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855012.98805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855012.98807: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855012.98809: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855012.98810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855012.98873: Set connection var ansible_timeout to 10 18823 1726855012.98884: Set 
connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855012.98893: Set connection var ansible_shell_type to sh 18823 1726855012.98904: Set connection var ansible_shell_executable to /bin/sh 18823 1726855012.98913: Set connection var ansible_connection to ssh 18823 1726855012.98935: Set connection var ansible_pipelining to False 18823 1726855012.98965: variable 'ansible_shell_executable' from source: unknown 18823 1726855012.98973: variable 'ansible_connection' from source: unknown 18823 1726855012.98981: variable 'ansible_module_compression' from source: unknown 18823 1726855012.98990: variable 'ansible_shell_type' from source: unknown 18823 1726855012.98998: variable 'ansible_shell_executable' from source: unknown 18823 1726855012.99006: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855012.99014: variable 'ansible_pipelining' from source: unknown 18823 1726855012.99021: variable 'ansible_timeout' from source: unknown 18823 1726855012.99094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855012.99295: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855012.99321: variable 'omit' from source: magic vars 18823 1726855012.99332: starting attempt loop 18823 1726855012.99338: running the handler 18823 1726855012.99396: variable 'network_provider' from source: set_fact 18823 1726855012.99500: variable 'network_provider' from source: set_fact 18823 1726855012.99527: handler run complete 18823 1726855012.99551: attempt loop complete, returning result 18823 1726855012.99557: _execute() done 18823 1726855012.99565: dumping result to json 18823 1726855012.99583: done dumping result, returning 18823 1726855012.99647: done running 
TaskExecutor() for managed_node2/TASK: Show network_provider [0affcc66-ac2b-d391-077c-00000000000c] 18823 1726855012.99650: sending task result for task 0affcc66-ac2b-d391-077c-00000000000c 18823 1726855012.99730: done sending task result for task 0affcc66-ac2b-d391-077c-00000000000c 18823 1726855012.99732: WORKER PROCESS EXITING ok: [managed_node2] => { "network_provider": "nm" } 18823 1726855012.99802: no more pending results, returning what we have 18823 1726855012.99806: results queue empty 18823 1726855012.99807: checking for any_errors_fatal 18823 1726855012.99814: done checking for any_errors_fatal 18823 1726855012.99815: checking for max_fail_percentage 18823 1726855012.99816: done checking for max_fail_percentage 18823 1726855012.99817: checking to see if all hosts have failed and the running result is not ok 18823 1726855012.99817: done checking to see if all hosts have failed 18823 1726855012.99818: getting the remaining hosts for this loop 18823 1726855012.99819: done getting the remaining hosts for this loop 18823 1726855012.99823: getting the next task for host managed_node2 18823 1726855012.99830: done getting next task for host managed_node2 18823 1726855012.99832: ^ task is: TASK: meta (flush_handlers) 18823 1726855012.99834: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855012.99838: getting variables 18823 1726855012.99839: in VariableManager get_vars() 18823 1726855012.99867: Calling all_inventory to load vars for managed_node2 18823 1726855012.99870: Calling groups_inventory to load vars for managed_node2 18823 1726855012.99873: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855012.99884: Calling all_plugins_play to load vars for managed_node2 18823 1726855012.99887: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855012.99892: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.00111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.00279: done with get_vars() 18823 1726855013.00292: done getting variables 18823 1726855013.00366: in VariableManager get_vars() 18823 1726855013.00375: Calling all_inventory to load vars for managed_node2 18823 1726855013.00377: Calling groups_inventory to load vars for managed_node2 18823 1726855013.00379: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.00383: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.00385: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855013.00389: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.00529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.00727: done with get_vars() 18823 1726855013.00739: done queuing things up, now waiting for results queue to drain 18823 1726855013.00740: results queue empty 18823 1726855013.00741: checking for any_errors_fatal 18823 1726855013.00743: done checking for any_errors_fatal 18823 1726855013.00744: checking for max_fail_percentage 18823 1726855013.00745: done checking for max_fail_percentage 18823 1726855013.00745: checking to see if all hosts have failed and the running result is not 
ok 18823 1726855013.00746: done checking to see if all hosts have failed 18823 1726855013.00747: getting the remaining hosts for this loop 18823 1726855013.00748: done getting the remaining hosts for this loop 18823 1726855013.00750: getting the next task for host managed_node2 18823 1726855013.00757: done getting next task for host managed_node2 18823 1726855013.00759: ^ task is: TASK: meta (flush_handlers) 18823 1726855013.00760: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855013.00763: getting variables 18823 1726855013.00763: in VariableManager get_vars() 18823 1726855013.00780: Calling all_inventory to load vars for managed_node2 18823 1726855013.00782: Calling groups_inventory to load vars for managed_node2 18823 1726855013.00785: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.00792: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.00794: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855013.00797: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.00965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.01161: done with get_vars() 18823 1726855013.01169: done getting variables 18823 1726855013.01229: in VariableManager get_vars() 18823 1726855013.01236: Calling all_inventory to load vars for managed_node2 18823 1726855013.01238: Calling groups_inventory to load vars for managed_node2 18823 1726855013.01240: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.01243: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.01245: Calling groups_plugins_inventory to load vars for 
managed_node2 18823 1726855013.01247: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.01385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.01585: done with get_vars() 18823 1726855013.01597: done queuing things up, now waiting for results queue to drain 18823 1726855013.01599: results queue empty 18823 1726855013.01599: checking for any_errors_fatal 18823 1726855013.01600: done checking for any_errors_fatal 18823 1726855013.01601: checking for max_fail_percentage 18823 1726855013.01602: done checking for max_fail_percentage 18823 1726855013.01603: checking to see if all hosts have failed and the running result is not ok 18823 1726855013.01603: done checking to see if all hosts have failed 18823 1726855013.01604: getting the remaining hosts for this loop 18823 1726855013.01605: done getting the remaining hosts for this loop 18823 1726855013.01607: getting the next task for host managed_node2 18823 1726855013.01609: done getting next task for host managed_node2 18823 1726855013.01610: ^ task is: None 18823 1726855013.01611: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855013.01612: done queuing things up, now waiting for results queue to drain 18823 1726855013.01613: results queue empty 18823 1726855013.01614: checking for any_errors_fatal 18823 1726855013.01615: done checking for any_errors_fatal 18823 1726855013.01615: checking for max_fail_percentage 18823 1726855013.01616: done checking for max_fail_percentage 18823 1726855013.01617: checking to see if all hosts have failed and the running result is not ok 18823 1726855013.01617: done checking to see if all hosts have failed 18823 1726855013.01619: getting the next task for host managed_node2 18823 1726855013.01621: done getting next task for host managed_node2 18823 1726855013.01622: ^ task is: None 18823 1726855013.01623: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855013.01676: in VariableManager get_vars() 18823 1726855013.01694: done with get_vars() 18823 1726855013.01701: in VariableManager get_vars() 18823 1726855013.01711: done with get_vars() 18823 1726855013.01715: variable 'omit' from source: magic vars 18823 1726855013.01742: in VariableManager get_vars() 18823 1726855013.01766: done with get_vars() 18823 1726855013.01786: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18823 1726855013.01944: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855013.01961: getting the remaining hosts for this loop 18823 1726855013.01962: done getting the remaining hosts for this loop 18823 1726855013.01964: getting the next task for host managed_node2 18823 1726855013.01966: done getting next task for host managed_node2 18823 1726855013.01967: ^ task is: TASK: Gathering Facts 18823 1726855013.01968: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855013.01969: getting variables 18823 1726855013.01970: in VariableManager get_vars() 18823 1726855013.01979: Calling all_inventory to load vars for managed_node2 18823 1726855013.01982: Calling groups_inventory to load vars for managed_node2 18823 1726855013.01984: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.01991: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.01994: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855013.01997: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.02134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.02298: done with get_vars() 18823 1726855013.02310: done getting variables 18823 1726855013.02348: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 13:56:53 -0400 (0:00:00.052) 0:00:04.675 ****** 18823 1726855013.02372: entering _queue_task() for managed_node2/gather_facts 18823 1726855013.02655: worker is 1 (out of 1 available) 18823 1726855013.02667: exiting _queue_task() for managed_node2/gather_facts 18823 1726855013.02677: done queuing things up, now waiting for results queue to drain 18823 1726855013.02678: waiting for pending results... 
18823 1726855013.03007: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855013.03031: in run() - task 0affcc66-ac2b-d391-077c-0000000000f0 18823 1726855013.03049: variable 'ansible_search_path' from source: unknown 18823 1726855013.03076: calling self._execute() 18823 1726855013.03146: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855013.03156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855013.03160: variable 'omit' from source: magic vars 18823 1726855013.03693: variable 'ansible_distribution_major_version' from source: facts 18823 1726855013.03697: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855013.03700: variable 'omit' from source: magic vars 18823 1726855013.03703: variable 'omit' from source: magic vars 18823 1726855013.03705: variable 'omit' from source: magic vars 18823 1726855013.03708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855013.03710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855013.03713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855013.03715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855013.03717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855013.03731: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855013.03738: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855013.03746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855013.03842: Set connection var ansible_timeout to 10 18823 1726855013.03855: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855013.03862: Set connection var ansible_shell_type to sh 18823 1726855013.03871: Set connection var ansible_shell_executable to /bin/sh 18823 1726855013.03879: Set connection var ansible_connection to ssh 18823 1726855013.03891: Set connection var ansible_pipelining to False 18823 1726855013.03922: variable 'ansible_shell_executable' from source: unknown 18823 1726855013.03930: variable 'ansible_connection' from source: unknown 18823 1726855013.03936: variable 'ansible_module_compression' from source: unknown 18823 1726855013.03943: variable 'ansible_shell_type' from source: unknown 18823 1726855013.03949: variable 'ansible_shell_executable' from source: unknown 18823 1726855013.03956: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855013.03962: variable 'ansible_pipelining' from source: unknown 18823 1726855013.03968: variable 'ansible_timeout' from source: unknown 18823 1726855013.03975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855013.04153: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855013.04168: variable 'omit' from source: magic vars 18823 1726855013.04179: starting attempt loop 18823 1726855013.04186: running the handler 18823 1726855013.04211: variable 'ansible_facts' from source: unknown 18823 1726855013.04235: _low_level_execute_command(): starting 18823 1726855013.04249: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855013.04785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855013.04825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855013.04828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855013.04831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855013.04877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855013.04880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855013.04889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855013.04963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855013.07055: stdout chunk (state=3): >>>/root <<< 18823 1726855013.07202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855013.07241: stderr chunk (state=3): >>><<< 18823 1726855013.07243: stdout chunk (state=3): >>><<< 18823 1726855013.07257: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855013.07302: _low_level_execute_command(): starting 18823 1726855013.07309: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475 `" && echo ansible-tmp-1726855013.0726187-19046-40852105559475="` echo /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475 `" ) && sleep 0' 18823 1726855013.07743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855013.07746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855013.07748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 
is address <<< 18823 1726855013.07757: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855013.07760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855013.07804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855013.07807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855013.07889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855013.10629: stdout chunk (state=3): >>>ansible-tmp-1726855013.0726187-19046-40852105559475=/root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475 <<< 18823 1726855013.10846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855013.10850: stdout chunk (state=3): >>><<< 18823 1726855013.10853: stderr chunk (state=3): >>><<< 18823 1726855013.10993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855013.0726187-19046-40852105559475=/root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855013.11000: variable 'ansible_module_compression' from source: unknown 18823 1726855013.11002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855013.11044: variable 'ansible_facts' from source: unknown 18823 1726855013.11309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py 18823 1726855013.11445: Sending initial data 18823 1726855013.11457: Sent initial data (153 bytes) 18823 1726855013.11871: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855013.11900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855013.11940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855013.11953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855013.12032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855013.14276: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855013.14364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855013.14437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmppxoqonp1 /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py <<< 18823 1726855013.14440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py" <<< 18823 1726855013.14514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmppxoqonp1" to remote "/root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py" <<< 18823 1726855013.16284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855013.16326: stderr chunk (state=3): >>><<< 18823 1726855013.16433: stdout chunk (state=3): >>><<< 18823 1726855013.16437: done transferring module to remote 18823 1726855013.16439: _low_level_execute_command(): starting 18823 1726855013.16442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/ /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py && sleep 0' 18823 1726855013.17067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855013.17083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855013.17195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855013.17224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855013.17335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18823 1726855013.19977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855013.19981: stdout chunk (state=3): >>><<< 18823 1726855013.19983: stderr chunk (state=3): >>><<< 18823 1726855013.20098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18823 1726855013.20104: _low_level_execute_command(): starting 18823 1726855013.20106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/AnsiballZ_setup.py && sleep 0' 18823 1726855013.20690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855013.20764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855013.20828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855013.20846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855013.20880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855013.21014: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 4 <<< 18823 1726855013.88882: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", 
"mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.615234375, "5m": 0.4150390625, "15m": 0.2099609375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "53", "epoch": "1726855013", "epoch_int": "1726855013", "date": "2024-09-20", "time": "13:56:53", "iso8601_micro": "2024-09-20T17:56:53.535392Z", "iso8601": "2024-09-20T17:56:53Z", "iso8601_basic": "20240920T135653535392", "iso8601_basic_short": "20240920T135653", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2940, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 591, "free": 2940}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", 
"ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 796, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795004416, "block_size": 4096, "block_total": 65519099, "block_available": 63914796, "block_used": 1604303, 
"inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", 
"tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::d8:4cff:fefa:7a71"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855013.91742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855013.91746: stdout chunk (state=3): >>><<< 18823 1726855013.91749: stderr chunk (state=3): >>><<< 18823 1726855013.91752: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.615234375, "5m": 0.4150390625, "15m": 0.2099609375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "53", "epoch": "1726855013", "epoch_int": "1726855013", "date": "2024-09-20", "time": "13:56:53", "iso8601_micro": "2024-09-20T17:56:53.535392Z", "iso8601": "2024-09-20T17:56:53Z", "iso8601_basic": "20240920T135653535392", "iso8601_basic_short": "20240920T135653", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_processor": ["0", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2940, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 591, "free": 2940}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", 
"sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 796, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795004416, "block_size": 4096, "block_total": 65519099, "block_available": 63914796, "block_used": 1604303, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855013.92180: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855013.92220: _low_level_execute_command(): starting 18823 1726855013.92231: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855013.0726187-19046-40852105559475/ > /dev/null 2>&1 && sleep 0' 18823 1726855013.93584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855013.93830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855013.93895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855013.95780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855013.95794: stdout chunk (state=3): >>><<< 18823 1726855013.95807: stderr chunk (state=3): >>><<< 18823 1726855013.95829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855013.96094: handler run complete 18823 1726855013.96129: variable 'ansible_facts' from source: unknown 18823 
1726855013.96232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.96894: variable 'ansible_facts' from source: unknown 18823 1726855013.97055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.97197: attempt loop complete, returning result 18823 1726855013.97207: _execute() done 18823 1726855013.97218: dumping result to json 18823 1726855013.97248: done dumping result, returning 18823 1726855013.97259: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-0000000000f0] 18823 1726855013.97268: sending task result for task 0affcc66-ac2b-d391-077c-0000000000f0 ok: [managed_node2] 18823 1726855013.98244: no more pending results, returning what we have 18823 1726855013.98248: results queue empty 18823 1726855013.98249: checking for any_errors_fatal 18823 1726855013.98250: done checking for any_errors_fatal 18823 1726855013.98251: checking for max_fail_percentage 18823 1726855013.98252: done checking for max_fail_percentage 18823 1726855013.98253: checking to see if all hosts have failed and the running result is not ok 18823 1726855013.98254: done checking to see if all hosts have failed 18823 1726855013.98254: getting the remaining hosts for this loop 18823 1726855013.98256: done getting the remaining hosts for this loop 18823 1726855013.98259: getting the next task for host managed_node2 18823 1726855013.98264: done getting next task for host managed_node2 18823 1726855013.98266: ^ task is: TASK: meta (flush_handlers) 18823 1726855013.98268: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855013.98272: getting variables 18823 1726855013.98273: in VariableManager get_vars() 18823 1726855013.98299: Calling all_inventory to load vars for managed_node2 18823 1726855013.98302: Calling groups_inventory to load vars for managed_node2 18823 1726855013.98305: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.98341: done sending task result for task 0affcc66-ac2b-d391-077c-0000000000f0 18823 1726855013.98344: WORKER PROCESS EXITING 18823 1726855013.98354: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.98357: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855013.98359: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.98510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.98701: done with get_vars() 18823 1726855013.98712: done getting variables 18823 1726855013.98793: in VariableManager get_vars() 18823 1726855013.98802: Calling all_inventory to load vars for managed_node2 18823 1726855013.98805: Calling groups_inventory to load vars for managed_node2 18823 1726855013.98807: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.98811: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.98813: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855013.98816: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.98983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.99163: done with get_vars() 18823 1726855013.99176: done queuing things up, now waiting for results queue to drain 18823 1726855013.99177: results queue empty 18823 1726855013.99178: checking for any_errors_fatal 18823 1726855013.99181: done checking for any_errors_fatal 18823 1726855013.99182: checking for max_fail_percentage 18823 
1726855013.99183: done checking for max_fail_percentage 18823 1726855013.99183: checking to see if all hosts have failed and the running result is not ok 18823 1726855013.99184: done checking to see if all hosts have failed 18823 1726855013.99185: getting the remaining hosts for this loop 18823 1726855013.99195: done getting the remaining hosts for this loop 18823 1726855013.99198: getting the next task for host managed_node2 18823 1726855013.99202: done getting next task for host managed_node2 18823 1726855013.99204: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18823 1726855013.99205: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855013.99207: getting variables 18823 1726855013.99208: in VariableManager get_vars() 18823 1726855013.99215: Calling all_inventory to load vars for managed_node2 18823 1726855013.99217: Calling groups_inventory to load vars for managed_node2 18823 1726855013.99219: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855013.99224: Calling all_plugins_play to load vars for managed_node2 18823 1726855013.99226: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855013.99229: Calling groups_plugins_play to load vars for managed_node2 18823 1726855013.99371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855013.99554: done with get_vars() 18823 1726855013.99562: done getting variables 18823 1726855013.99602: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855013.99749: variable 'type' from source: play vars 18823 1726855013.99754: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 13:56:53 -0400 (0:00:00.974) 0:00:05.649 ****** 18823 1726855013.99794: entering _queue_task() for managed_node2/set_fact 18823 1726855014.00214: worker is 1 (out of 1 available) 18823 1726855014.00224: exiting _queue_task() for managed_node2/set_fact 18823 1726855014.00234: done queuing things up, now waiting for results queue to drain 18823 1726855014.00235: waiting for pending results... 18823 1726855014.00386: running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=lsr27 18823 1726855014.00494: in run() - task 0affcc66-ac2b-d391-077c-00000000000f 18823 1726855014.00514: variable 'ansible_search_path' from source: unknown 18823 1726855014.00553: calling self._execute() 18823 1726855014.00633: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.00645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.00661: variable 'omit' from source: magic vars 18823 1726855014.01024: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.01041: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.01053: variable 'omit' from source: magic vars 18823 1726855014.01085: variable 'omit' from source: magic vars 18823 1726855014.01126: variable 'type' from source: play vars 18823 1726855014.01200: variable 'type' from source: play vars 18823 1726855014.01224: variable 'interface' from source: play 
vars 18823 1726855014.01293: variable 'interface' from source: play vars 18823 1726855014.01316: variable 'omit' from source: magic vars 18823 1726855014.01368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855014.01442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855014.01445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855014.01464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.01481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.01518: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855014.01528: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.01550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.01647: Set connection var ansible_timeout to 10 18823 1726855014.01765: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855014.01768: Set connection var ansible_shell_type to sh 18823 1726855014.01771: Set connection var ansible_shell_executable to /bin/sh 18823 1726855014.01772: Set connection var ansible_connection to ssh 18823 1726855014.01775: Set connection var ansible_pipelining to False 18823 1726855014.01777: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.01778: variable 'ansible_connection' from source: unknown 18823 1726855014.01780: variable 'ansible_module_compression' from source: unknown 18823 1726855014.01782: variable 'ansible_shell_type' from source: unknown 18823 1726855014.01784: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.01785: variable 'ansible_host' from 
source: host vars for 'managed_node2' 18823 1726855014.01789: variable 'ansible_pipelining' from source: unknown 18823 1726855014.01791: variable 'ansible_timeout' from source: unknown 18823 1726855014.01793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.01912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855014.01926: variable 'omit' from source: magic vars 18823 1726855014.01992: starting attempt loop 18823 1726855014.01995: running the handler 18823 1726855014.01998: handler run complete 18823 1726855014.02000: attempt loop complete, returning result 18823 1726855014.02003: _execute() done 18823 1726855014.02005: dumping result to json 18823 1726855014.02007: done dumping result, returning 18823 1726855014.02296: done running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=lsr27 [0affcc66-ac2b-d391-077c-00000000000f] 18823 1726855014.02301: sending task result for task 0affcc66-ac2b-d391-077c-00000000000f 18823 1726855014.02362: done sending task result for task 0affcc66-ac2b-d391-077c-00000000000f 18823 1726855014.02365: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 18823 1726855014.02425: no more pending results, returning what we have 18823 1726855014.02429: results queue empty 18823 1726855014.02430: checking for any_errors_fatal 18823 1726855014.02432: done checking for any_errors_fatal 18823 1726855014.02433: checking for max_fail_percentage 18823 1726855014.02435: done checking for max_fail_percentage 18823 1726855014.02436: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.02436: done checking to see 
if all hosts have failed 18823 1726855014.02437: getting the remaining hosts for this loop 18823 1726855014.02439: done getting the remaining hosts for this loop 18823 1726855014.02443: getting the next task for host managed_node2 18823 1726855014.02451: done getting next task for host managed_node2 18823 1726855014.02453: ^ task is: TASK: Include the task 'show_interfaces.yml' 18823 1726855014.02455: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855014.02459: getting variables 18823 1726855014.02461: in VariableManager get_vars() 18823 1726855014.02498: Calling all_inventory to load vars for managed_node2 18823 1726855014.02502: Calling groups_inventory to load vars for managed_node2 18823 1726855014.02506: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.02519: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.02523: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.02527: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.03224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.03715: done with get_vars() 18823 1726855014.03726: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 13:56:54 -0400 (0:00:00.041) 0:00:05.690 ****** 18823 1726855014.03927: entering _queue_task() for managed_node2/include_tasks 18823 1726855014.04437: worker is 1 (out of 1 available) 18823 1726855014.04450: exiting _queue_task() for 
managed_node2/include_tasks 18823 1726855014.04461: done queuing things up, now waiting for results queue to drain 18823 1726855014.04462: waiting for pending results... 18823 1726855014.05107: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 18823 1726855014.05113: in run() - task 0affcc66-ac2b-d391-077c-000000000010 18823 1726855014.05122: variable 'ansible_search_path' from source: unknown 18823 1726855014.05163: calling self._execute() 18823 1726855014.05280: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.05428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.05446: variable 'omit' from source: magic vars 18823 1726855014.06140: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.06395: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.06400: _execute() done 18823 1726855014.06404: dumping result to json 18823 1726855014.06407: done dumping result, returning 18823 1726855014.06409: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-d391-077c-000000000010] 18823 1726855014.06412: sending task result for task 0affcc66-ac2b-d391-077c-000000000010 18823 1726855014.06486: done sending task result for task 0affcc66-ac2b-d391-077c-000000000010 18823 1726855014.06492: WORKER PROCESS EXITING 18823 1726855014.06527: no more pending results, returning what we have 18823 1726855014.06532: in VariableManager get_vars() 18823 1726855014.06570: Calling all_inventory to load vars for managed_node2 18823 1726855014.06573: Calling groups_inventory to load vars for managed_node2 18823 1726855014.06577: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.06594: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.06599: Calling groups_plugins_inventory to load vars for managed_node2 18823 
1726855014.06602: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.07102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.07593: done with get_vars() 18823 1726855014.07601: variable 'ansible_search_path' from source: unknown 18823 1726855014.07615: we have included files to process 18823 1726855014.07616: generating all_blocks data 18823 1726855014.07617: done generating all_blocks data 18823 1726855014.07618: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18823 1726855014.07619: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18823 1726855014.07621: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18823 1726855014.08063: in VariableManager get_vars() 18823 1726855014.08079: done with get_vars() 18823 1726855014.08397: done processing included file 18823 1726855014.08399: iterating over new_blocks loaded from include file 18823 1726855014.08400: in VariableManager get_vars() 18823 1726855014.08411: done with get_vars() 18823 1726855014.08413: filtering new block on tags 18823 1726855014.08429: done filtering new block on tags 18823 1726855014.08431: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 18823 1726855014.08436: extending task lists for all hosts with included blocks 18823 1726855014.08617: done extending task lists 18823 1726855014.08619: done processing included files 18823 1726855014.08620: results queue empty 18823 1726855014.08620: checking for any_errors_fatal 18823 1726855014.08624: done checking for any_errors_fatal 18823 
1726855014.08625: checking for max_fail_percentage 18823 1726855014.08626: done checking for max_fail_percentage 18823 1726855014.08626: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.08627: done checking to see if all hosts have failed 18823 1726855014.08628: getting the remaining hosts for this loop 18823 1726855014.08629: done getting the remaining hosts for this loop 18823 1726855014.08631: getting the next task for host managed_node2 18823 1726855014.08634: done getting next task for host managed_node2 18823 1726855014.08637: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18823 1726855014.08639: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.08641: getting variables 18823 1726855014.08642: in VariableManager get_vars() 18823 1726855014.09204: Calling all_inventory to load vars for managed_node2 18823 1726855014.09207: Calling groups_inventory to load vars for managed_node2 18823 1726855014.09210: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.09215: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.09217: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.09220: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.09352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.09734: done with get_vars() 18823 1726855014.09742: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:56:54 -0400 (0:00:00.060) 0:00:05.751 ****** 18823 1726855014.09999: entering _queue_task() for managed_node2/include_tasks 18823 1726855014.10496: worker is 1 (out of 1 available) 18823 1726855014.10507: exiting _queue_task() for managed_node2/include_tasks 18823 1726855014.10516: done queuing things up, now waiting for results queue to drain 18823 1726855014.10517: waiting for pending results... 
18823 1726855014.10655: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 18823 1726855014.10772: in run() - task 0affcc66-ac2b-d391-077c-000000000104 18823 1726855014.10794: variable 'ansible_search_path' from source: unknown 18823 1726855014.10854: variable 'ansible_search_path' from source: unknown 18823 1726855014.10858: calling self._execute() 18823 1726855014.10929: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.10940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.10960: variable 'omit' from source: magic vars 18823 1726855014.11345: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.11362: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.11373: _execute() done 18823 1726855014.11382: dumping result to json 18823 1726855014.11400: done dumping result, returning 18823 1726855014.11504: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-d391-077c-000000000104] 18823 1726855014.11508: sending task result for task 0affcc66-ac2b-d391-077c-000000000104 18823 1726855014.11574: done sending task result for task 0affcc66-ac2b-d391-077c-000000000104 18823 1726855014.11578: WORKER PROCESS EXITING 18823 1726855014.11610: no more pending results, returning what we have 18823 1726855014.11615: in VariableManager get_vars() 18823 1726855014.11648: Calling all_inventory to load vars for managed_node2 18823 1726855014.11652: Calling groups_inventory to load vars for managed_node2 18823 1726855014.11655: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.11671: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.11674: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.11678: Calling groups_plugins_play to load vars for managed_node2 18823 
1726855014.11970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.12294: done with get_vars() 18823 1726855014.12301: variable 'ansible_search_path' from source: unknown 18823 1726855014.12302: variable 'ansible_search_path' from source: unknown 18823 1726855014.12338: we have included files to process 18823 1726855014.12339: generating all_blocks data 18823 1726855014.12340: done generating all_blocks data 18823 1726855014.12341: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18823 1726855014.12342: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18823 1726855014.12344: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18823 1726855014.12651: done processing included file 18823 1726855014.12653: iterating over new_blocks loaded from include file 18823 1726855014.12655: in VariableManager get_vars() 18823 1726855014.12672: done with get_vars() 18823 1726855014.12674: filtering new block on tags 18823 1726855014.12691: done filtering new block on tags 18823 1726855014.12694: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 18823 1726855014.12698: extending task lists for all hosts with included blocks 18823 1726855014.12797: done extending task lists 18823 1726855014.12799: done processing included files 18823 1726855014.12799: results queue empty 18823 1726855014.12800: checking for any_errors_fatal 18823 1726855014.12803: done checking for any_errors_fatal 18823 1726855014.12803: checking for max_fail_percentage 18823 1726855014.12804: done 
checking for max_fail_percentage 18823 1726855014.12805: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.12806: done checking to see if all hosts have failed 18823 1726855014.12806: getting the remaining hosts for this loop 18823 1726855014.12807: done getting the remaining hosts for this loop 18823 1726855014.12809: getting the next task for host managed_node2 18823 1726855014.12813: done getting next task for host managed_node2 18823 1726855014.12815: ^ task is: TASK: Gather current interface info 18823 1726855014.12818: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.12819: getting variables 18823 1726855014.12820: in VariableManager get_vars() 18823 1726855014.12827: Calling all_inventory to load vars for managed_node2 18823 1726855014.12829: Calling groups_inventory to load vars for managed_node2 18823 1726855014.12831: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.12835: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.12837: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.12840: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.12962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.13152: done with get_vars() 18823 1726855014.13161: done getting variables 18823 1726855014.13200: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:56:54 -0400 (0:00:00.032) 0:00:05.784 ****** 18823 1726855014.13233: entering _queue_task() for managed_node2/command 18823 1726855014.13516: worker is 1 (out of 1 available) 18823 1726855014.13527: exiting _queue_task() for managed_node2/command 18823 1726855014.13651: done queuing things up, now waiting for results queue to drain 18823 1726855014.13652: waiting for pending results... 
18823 1726855014.13876: running TaskExecutor() for managed_node2/TASK: Gather current interface info 18823 1726855014.13974: in run() - task 0affcc66-ac2b-d391-077c-000000000115 18823 1726855014.13979: variable 'ansible_search_path' from source: unknown 18823 1726855014.13981: variable 'ansible_search_path' from source: unknown 18823 1726855014.13984: calling self._execute() 18823 1726855014.14047: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.14057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.14070: variable 'omit' from source: magic vars 18823 1726855014.14499: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.14522: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.14537: variable 'omit' from source: magic vars 18823 1726855014.14585: variable 'omit' from source: magic vars 18823 1726855014.14630: variable 'omit' from source: magic vars 18823 1726855014.14676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855014.14718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855014.14751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855014.14792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.14795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.14825: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855014.14835: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.14861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 
1726855014.14955: Set connection var ansible_timeout to 10 18823 1726855014.14970: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855014.14992: Set connection var ansible_shell_type to sh 18823 1726855014.14995: Set connection var ansible_shell_executable to /bin/sh 18823 1726855014.14998: Set connection var ansible_connection to ssh 18823 1726855014.15061: Set connection var ansible_pipelining to False 18823 1726855014.15064: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.15067: variable 'ansible_connection' from source: unknown 18823 1726855014.15070: variable 'ansible_module_compression' from source: unknown 18823 1726855014.15074: variable 'ansible_shell_type' from source: unknown 18823 1726855014.15076: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.15078: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.15080: variable 'ansible_pipelining' from source: unknown 18823 1726855014.15081: variable 'ansible_timeout' from source: unknown 18823 1726855014.15084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.15280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855014.15284: variable 'omit' from source: magic vars 18823 1726855014.15286: starting attempt loop 18823 1726855014.15290: running the handler 18823 1726855014.15292: _low_level_execute_command(): starting 18823 1726855014.15294: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855014.16007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855014.16050: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.16063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855014.16084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855014.16156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.16194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855014.16240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855014.16372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855014.18053: stdout chunk (state=3): >>>/root <<< 18823 1726855014.18291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855014.18320: stderr chunk (state=3): >>><<< 18823 1726855014.18323: stdout chunk (state=3): >>><<< 18823 1726855014.18344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855014.18365: _low_level_execute_command(): starting 18823 1726855014.18401: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518 `" && echo ansible-tmp-1726855014.1835089-19105-181720929526518="` echo /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518 `" ) && sleep 0' 18823 1726855014.19615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855014.19619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855014.19622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.19684: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855014.19803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.19816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855014.19852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855014.19952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855014.21924: stdout chunk (state=3): >>>ansible-tmp-1726855014.1835089-19105-181720929526518=/root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518 <<< 18823 1726855014.22036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855014.22074: stderr chunk (state=3): >>><<< 18823 1726855014.22085: stdout chunk (state=3): >>><<< 18823 1726855014.22108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855014.1835089-19105-181720929526518=/root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855014.22295: variable 'ansible_module_compression' from source: unknown 18823 1726855014.22325: ANSIBALLZ: Using generic lock for ansible.legacy.command 18823 1726855014.22332: ANSIBALLZ: Acquiring lock 18823 1726855014.22338: ANSIBALLZ: Lock acquired: 140142269228544 18823 1726855014.22343: ANSIBALLZ: Creating module 18823 1726855014.46197: ANSIBALLZ: Writing module into payload 18823 1726855014.46302: ANSIBALLZ: Writing module 18823 1726855014.46334: ANSIBALLZ: Renaming module 18823 1726855014.46344: ANSIBALLZ: Done creating module 18823 1726855014.46365: variable 'ansible_facts' from source: unknown 18823 1726855014.46452: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py 18823 1726855014.46609: Sending initial data 18823 1726855014.46618: Sent initial data (156 bytes) 18823 1726855014.47337: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855014.47439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.47474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855014.47504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855014.47545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855014.47652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855014.49424: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855014.49514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855014.49617: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpipfjelp4 /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py <<< 18823 1726855014.49632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py" <<< 18823 1726855014.49677: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpipfjelp4" to remote "/root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py" <<< 18823 1726855014.50759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855014.50762: stdout chunk (state=3): >>><<< 18823 1726855014.50764: stderr chunk (state=3): >>><<< 18823 1726855014.50766: done transferring module to remote 18823 1726855014.51078: _low_level_execute_command(): starting 18823 1726855014.51081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/ /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py && sleep 0' 18823 1726855014.52081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855014.52084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855014.52093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855014.52095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.52149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855014.52171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855014.52253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855014.54185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855014.54191: stdout chunk (state=3): >>><<< 18823 1726855014.54194: stderr chunk (state=3): >>><<< 18823 1726855014.54210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855014.54214: _low_level_execute_command(): starting 18823 1726855014.54218: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/AnsiballZ_command.py && sleep 0' 18823 1726855014.55189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.55473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855014.55486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855014.55585: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 18823 1726855014.71123: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:56:54.705576", "end": "2024-09-20 13:56:54.708892", "delta": "0:00:00.003316", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855014.72610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855014.72637: stderr chunk (state=3): >>><<< 18823 1726855014.72640: stdout chunk (state=3): >>><<< 18823 1726855014.72655: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:56:54.705576", "end": "2024-09-20 13:56:54.708892", "delta": "0:00:00.003316", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855014.72690: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855014.72697: _low_level_execute_command(): starting 18823 1726855014.72704: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855014.1835089-19105-181720929526518/ > /dev/null 2>&1 && sleep 0' 18823 1726855014.73150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855014.73153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.73161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855014.73163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855014.73168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855014.73218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855014.73221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855014.73310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855014.75141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855014.75166: stderr chunk (state=3): >>><<< 18823 1726855014.75169: stdout chunk (state=3): >>><<< 18823 1726855014.75182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855014.75189: handler run complete 18823 1726855014.75207: Evaluated conditional (False): False 18823 1726855014.75217: attempt loop complete, returning result 18823 1726855014.75219: _execute() done 18823 1726855014.75222: dumping result to json 18823 1726855014.75227: done dumping result, returning 18823 1726855014.75234: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcc66-ac2b-d391-077c-000000000115] 18823 1726855014.75239: sending task result for task 0affcc66-ac2b-d391-077c-000000000115 18823 1726855014.75333: done sending task result for task 0affcc66-ac2b-d391-077c-000000000115 18823 1726855014.75336: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003316", "end": "2024-09-20 13:56:54.708892", "rc": 0, "start": "2024-09-20 13:56:54.705576" } STDOUT: bonding_masters eth0 lo 18823 1726855014.75405: no more pending results, returning what we have 18823 1726855014.75410: results queue empty 18823 1726855014.75410: checking for any_errors_fatal 18823 1726855014.75412: done checking for any_errors_fatal 18823 1726855014.75413: checking for max_fail_percentage 18823 1726855014.75414: done checking for max_fail_percentage 18823 1726855014.75415: checking to see if 
all hosts have failed and the running result is not ok 18823 1726855014.75416: done checking to see if all hosts have failed 18823 1726855014.75416: getting the remaining hosts for this loop 18823 1726855014.75418: done getting the remaining hosts for this loop 18823 1726855014.75421: getting the next task for host managed_node2 18823 1726855014.75428: done getting next task for host managed_node2 18823 1726855014.75430: ^ task is: TASK: Set current_interfaces 18823 1726855014.75434: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.75437: getting variables 18823 1726855014.75498: in VariableManager get_vars() 18823 1726855014.75523: Calling all_inventory to load vars for managed_node2 18823 1726855014.75526: Calling groups_inventory to load vars for managed_node2 18823 1726855014.75529: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.75539: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.75542: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.75544: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.75713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.75898: done with get_vars() 18823 1726855014.75912: done getting variables 18823 1726855014.76011: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:56:54 -0400 (0:00:00.628) 0:00:06.412 ****** 18823 1726855014.76051: entering _queue_task() for managed_node2/set_fact 18823 1726855014.76480: worker is 1 (out of 1 available) 18823 1726855014.76494: exiting _queue_task() for managed_node2/set_fact 18823 1726855014.76510: done queuing things up, now waiting for results queue to drain 18823 1726855014.76511: waiting for pending results... 
18823 1726855014.76920: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 18823 1726855014.76924: in run() - task 0affcc66-ac2b-d391-077c-000000000116 18823 1726855014.76927: variable 'ansible_search_path' from source: unknown 18823 1726855014.76933: variable 'ansible_search_path' from source: unknown 18823 1726855014.77020: calling self._execute() 18823 1726855014.77058: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.77073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.77094: variable 'omit' from source: magic vars 18823 1726855014.77510: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.77530: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.77560: variable 'omit' from source: magic vars 18823 1726855014.77609: variable 'omit' from source: magic vars 18823 1726855014.77691: variable '_current_interfaces' from source: set_fact 18823 1726855014.77740: variable 'omit' from source: magic vars 18823 1726855014.77777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855014.77860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855014.77864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855014.77867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.77869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.77900: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855014.77905: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.77908: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.77985: Set connection var ansible_timeout to 10 18823 1726855014.78008: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855014.78013: Set connection var ansible_shell_type to sh 18823 1726855014.78015: Set connection var ansible_shell_executable to /bin/sh 18823 1726855014.78019: Set connection var ansible_connection to ssh 18823 1726855014.78025: Set connection var ansible_pipelining to False 18823 1726855014.78044: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.78047: variable 'ansible_connection' from source: unknown 18823 1726855014.78050: variable 'ansible_module_compression' from source: unknown 18823 1726855014.78056: variable 'ansible_shell_type' from source: unknown 18823 1726855014.78058: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.78060: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.78063: variable 'ansible_pipelining' from source: unknown 18823 1726855014.78065: variable 'ansible_timeout' from source: unknown 18823 1726855014.78192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.78211: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855014.78225: variable 'omit' from source: magic vars 18823 1726855014.78233: starting attempt loop 18823 1726855014.78238: running the handler 18823 1726855014.78250: handler run complete 18823 1726855014.78261: attempt loop complete, returning result 18823 1726855014.78268: _execute() done 18823 1726855014.78274: dumping result to json 18823 1726855014.78282: done dumping result, returning 18823 
1726855014.78297: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcc66-ac2b-d391-077c-000000000116] 18823 1726855014.78307: sending task result for task 0affcc66-ac2b-d391-077c-000000000116 18823 1726855014.78403: done sending task result for task 0affcc66-ac2b-d391-077c-000000000116 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18823 1726855014.78478: no more pending results, returning what we have 18823 1726855014.78481: results queue empty 18823 1726855014.78482: checking for any_errors_fatal 18823 1726855014.78492: done checking for any_errors_fatal 18823 1726855014.78493: checking for max_fail_percentage 18823 1726855014.78495: done checking for max_fail_percentage 18823 1726855014.78495: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.78496: done checking to see if all hosts have failed 18823 1726855014.78496: getting the remaining hosts for this loop 18823 1726855014.78498: done getting the remaining hosts for this loop 18823 1726855014.78503: getting the next task for host managed_node2 18823 1726855014.78511: done getting next task for host managed_node2 18823 1726855014.78513: ^ task is: TASK: Show current_interfaces 18823 1726855014.78516: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.78520: getting variables 18823 1726855014.78522: in VariableManager get_vars() 18823 1726855014.78709: Calling all_inventory to load vars for managed_node2 18823 1726855014.78711: Calling groups_inventory to load vars for managed_node2 18823 1726855014.78714: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.78725: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.78727: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.78731: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.79097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.79576: done with get_vars() 18823 1726855014.79586: done getting variables 18823 1726855014.79630: WORKER PROCESS EXITING 18823 1726855014.79721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:56:54 -0400 (0:00:00.036) 0:00:06.449 ****** 18823 1726855014.79752: entering _queue_task() for managed_node2/debug 18823 1726855014.80047: worker is 1 (out of 1 available) 18823 1726855014.80059: exiting _queue_task() for managed_node2/debug 18823 1726855014.80071: done queuing things up, now waiting for results queue to drain 18823 1726855014.80072: waiting for pending results... 
18823 1726855014.80314: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 18823 1726855014.80417: in run() - task 0affcc66-ac2b-d391-077c-000000000105 18823 1726855014.80436: variable 'ansible_search_path' from source: unknown 18823 1726855014.80443: variable 'ansible_search_path' from source: unknown 18823 1726855014.80481: calling self._execute() 18823 1726855014.80568: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.80579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.80596: variable 'omit' from source: magic vars 18823 1726855014.80961: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.80979: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.80992: variable 'omit' from source: magic vars 18823 1726855014.81034: variable 'omit' from source: magic vars 18823 1726855014.81136: variable 'current_interfaces' from source: set_fact 18823 1726855014.81170: variable 'omit' from source: magic vars 18823 1726855014.81216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855014.81253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855014.81281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855014.81306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.81322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855014.81354: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855014.81361: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.81368: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.81471: Set connection var ansible_timeout to 10 18823 1726855014.81483: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855014.81495: Set connection var ansible_shell_type to sh 18823 1726855014.81507: Set connection var ansible_shell_executable to /bin/sh 18823 1726855014.81517: Set connection var ansible_connection to ssh 18823 1726855014.81525: Set connection var ansible_pipelining to False 18823 1726855014.81552: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.81559: variable 'ansible_connection' from source: unknown 18823 1726855014.81565: variable 'ansible_module_compression' from source: unknown 18823 1726855014.81571: variable 'ansible_shell_type' from source: unknown 18823 1726855014.81579: variable 'ansible_shell_executable' from source: unknown 18823 1726855014.81626: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.81635: variable 'ansible_pipelining' from source: unknown 18823 1726855014.81642: variable 'ansible_timeout' from source: unknown 18823 1726855014.81649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.81789: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855014.81810: variable 'omit' from source: magic vars 18823 1726855014.81992: starting attempt loop 18823 1726855014.81996: running the handler 18823 1726855014.81998: handler run complete 18823 1726855014.82003: attempt loop complete, returning result 18823 1726855014.82005: _execute() done 18823 1726855014.82007: dumping result to json 18823 1726855014.82009: done dumping result, returning 18823 1726855014.82011: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcc66-ac2b-d391-077c-000000000105] 18823 1726855014.82013: sending task result for task 0affcc66-ac2b-d391-077c-000000000105 18823 1726855014.82075: done sending task result for task 0affcc66-ac2b-d391-077c-000000000105 18823 1726855014.82078: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18823 1726855014.82133: no more pending results, returning what we have 18823 1726855014.82136: results queue empty 18823 1726855014.82138: checking for any_errors_fatal 18823 1726855014.82144: done checking for any_errors_fatal 18823 1726855014.82145: checking for max_fail_percentage 18823 1726855014.82146: done checking for max_fail_percentage 18823 1726855014.82147: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.82148: done checking to see if all hosts have failed 18823 1726855014.82148: getting the remaining hosts for this loop 18823 1726855014.82150: done getting the remaining hosts for this loop 18823 1726855014.82154: getting the next task for host managed_node2 18823 1726855014.82163: done getting next task for host managed_node2 18823 1726855014.82165: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18823 1726855014.82168: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.82171: getting variables 18823 1726855014.82173: in VariableManager get_vars() 18823 1726855014.82203: Calling all_inventory to load vars for managed_node2 18823 1726855014.82206: Calling groups_inventory to load vars for managed_node2 18823 1726855014.82210: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.82221: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.82225: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.82228: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.82586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.82789: done with get_vars() 18823 1726855014.82798: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 13:56:54 -0400 (0:00:00.031) 0:00:06.480 ****** 18823 1726855014.82882: entering _queue_task() for managed_node2/include_tasks 18823 1726855014.83224: worker is 1 (out of 1 available) 18823 1726855014.83236: exiting _queue_task() for managed_node2/include_tasks 18823 1726855014.83246: done queuing things up, now waiting for results queue to drain 18823 1726855014.83247: waiting for pending results... 
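The `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']` result above suggests the task at show_interfaces.yml:5 is a plain `debug` of a previously set fact. A hedged reconstruction follows; the exact `msg` wording is an assumption, only the task name and the `current_interfaces` variable are confirmed by the log:

```yaml
# Hypothetical sketch of the "Show current_interfaces" task
# (reconstructed from the log output above; not the verbatim source).
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```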
18823 1726855014.83416: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 18823 1726855014.83540: in run() - task 0affcc66-ac2b-d391-077c-000000000011 18823 1726855014.83580: variable 'ansible_search_path' from source: unknown 18823 1726855014.83620: calling self._execute() 18823 1726855014.83700: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.83719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.83793: variable 'omit' from source: magic vars 18823 1726855014.84097: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.84121: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.84131: _execute() done 18823 1726855014.84139: dumping result to json 18823 1726855014.84146: done dumping result, returning 18823 1726855014.84156: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0affcc66-ac2b-d391-077c-000000000011] 18823 1726855014.84166: sending task result for task 0affcc66-ac2b-d391-077c-000000000011 18823 1726855014.84450: done sending task result for task 0affcc66-ac2b-d391-077c-000000000011 18823 1726855014.84453: WORKER PROCESS EXITING 18823 1726855014.84476: no more pending results, returning what we have 18823 1726855014.84481: in VariableManager get_vars() 18823 1726855014.84516: Calling all_inventory to load vars for managed_node2 18823 1726855014.84519: Calling groups_inventory to load vars for managed_node2 18823 1726855014.84522: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.84533: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.84536: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.84539: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.84851: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.85050: done with get_vars() 18823 1726855014.85057: variable 'ansible_search_path' from source: unknown 18823 1726855014.85070: we have included files to process 18823 1726855014.85071: generating all_blocks data 18823 1726855014.85073: done generating all_blocks data 18823 1726855014.85077: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18823 1726855014.85079: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18823 1726855014.85081: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18823 1726855014.86234: in VariableManager get_vars() 18823 1726855014.86252: done with get_vars() 18823 1726855014.86664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 18823 1726855014.87905: done processing included file 18823 1726855014.87908: iterating over new_blocks loaded from include file 18823 1726855014.87910: in VariableManager get_vars() 18823 1726855014.87924: done with get_vars() 18823 1726855014.87925: filtering new block on tags 18823 1726855014.87957: done filtering new block on tags 18823 1726855014.87960: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 18823 1726855014.87965: extending task lists for all hosts with included blocks 18823 1726855014.88257: done extending task lists 18823 1726855014.88258: done processing included files 18823 1726855014.88259: results queue empty 18823 1726855014.88260: checking for any_errors_fatal 18823 1726855014.88263: done checking for 
any_errors_fatal 18823 1726855014.88264: checking for max_fail_percentage 18823 1726855014.88265: done checking for max_fail_percentage 18823 1726855014.88265: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.88266: done checking to see if all hosts have failed 18823 1726855014.88267: getting the remaining hosts for this loop 18823 1726855014.88268: done getting the remaining hosts for this loop 18823 1726855014.88271: getting the next task for host managed_node2 18823 1726855014.88275: done getting next task for host managed_node2 18823 1726855014.88276: ^ task is: TASK: Ensure state in ["present", "absent"] 18823 1726855014.88279: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.88281: getting variables 18823 1726855014.88282: in VariableManager get_vars() 18823 1726855014.88495: Calling all_inventory to load vars for managed_node2 18823 1726855014.88498: Calling groups_inventory to load vars for managed_node2 18823 1726855014.88504: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.88511: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.88513: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.88516: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.88671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.89086: done with get_vars() 18823 1726855014.89204: done getting variables 18823 1726855014.89278: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:56:54 -0400 (0:00:00.065) 0:00:06.545 ****** 18823 1726855014.89414: entering _queue_task() for managed_node2/fail 18823 1726855014.89415: Creating lock for fail 18823 1726855014.90084: worker is 1 (out of 1 available) 18823 1726855014.90096: exiting _queue_task() for managed_node2/fail 18823 1726855014.90107: done queuing things up, now waiting for results queue to drain 18823 1726855014.90108: waiting for pending results... 
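The include that just completed (task path tests_ethernet.yml:26, id 0affcc66-ac2b-d391-077c-000000000011) loaded manage_test_interface.yml for managed_node2. A minimal sketch of what that play-level task likely looks like; the relative path is an assumption inferred from the logged file locations:

```yaml
# Hypothetical sketch of the include at tests_ethernet.yml:26
# (task name taken from the log banner; the path is an assumption).
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
```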
18823 1726855014.90373: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 18823 1726855014.90617: in run() - task 0affcc66-ac2b-d391-077c-000000000131 18823 1726855014.90743: variable 'ansible_search_path' from source: unknown 18823 1726855014.90748: variable 'ansible_search_path' from source: unknown 18823 1726855014.90767: calling self._execute() 18823 1726855014.90995: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.90998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.91289: variable 'omit' from source: magic vars 18823 1726855014.91886: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.91908: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.92051: variable 'state' from source: include params 18823 1726855014.92062: Evaluated conditional (state not in ["present", "absent"]): False 18823 1726855014.92069: when evaluation is False, skipping this task 18823 1726855014.92076: _execute() done 18823 1726855014.92082: dumping result to json 18823 1726855014.92093: done dumping result, returning 18823 1726855014.92103: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0affcc66-ac2b-d391-077c-000000000131] 18823 1726855014.92112: sending task result for task 0affcc66-ac2b-d391-077c-000000000131 skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 18823 1726855014.92259: no more pending results, returning what we have 18823 1726855014.92264: results queue empty 18823 1726855014.92265: checking for any_errors_fatal 18823 1726855014.92267: done checking for any_errors_fatal 18823 1726855014.92267: checking for max_fail_percentage 18823 1726855014.92269: done checking for max_fail_percentage 18823 1726855014.92270: checking to see if all hosts 
have failed and the running result is not ok 18823 1726855014.92271: done checking to see if all hosts have failed 18823 1726855014.92272: getting the remaining hosts for this loop 18823 1726855014.92274: done getting the remaining hosts for this loop 18823 1726855014.92277: getting the next task for host managed_node2 18823 1726855014.92285: done getting next task for host managed_node2 18823 1726855014.92291: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 18823 1726855014.92295: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.92302: getting variables 18823 1726855014.92304: in VariableManager get_vars() 18823 1726855014.92333: Calling all_inventory to load vars for managed_node2 18823 1726855014.92336: Calling groups_inventory to load vars for managed_node2 18823 1726855014.92339: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.92353: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.92356: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.92358: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.92852: done sending task result for task 0affcc66-ac2b-d391-077c-000000000131 18823 1726855014.92855: WORKER PROCESS EXITING 18823 1726855014.92879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.93041: done with get_vars() 18823 1726855014.93050: done getting variables 18823 1726855014.93107: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:56:54 -0400 (0:00:00.037) 0:00:06.583 ****** 18823 1726855014.93132: entering _queue_task() for managed_node2/fail 18823 1726855014.93454: worker is 1 (out of 1 available) 18823 1726855014.93467: exiting _queue_task() for managed_node2/fail 18823 1726855014.93478: done queuing things up, now waiting for results queue to drain 18823 1726855014.93479: waiting for pending results... 
18823 1726855014.93749: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 18823 1726855014.93859: in run() - task 0affcc66-ac2b-d391-077c-000000000132 18823 1726855014.93878: variable 'ansible_search_path' from source: unknown 18823 1726855014.93886: variable 'ansible_search_path' from source: unknown 18823 1726855014.93934: calling self._execute() 18823 1726855014.94023: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.94034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.94054: variable 'omit' from source: magic vars 18823 1726855014.94427: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.94445: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.94596: variable 'type' from source: set_fact 18823 1726855014.94611: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 18823 1726855014.94619: when evaluation is False, skipping this task 18823 1726855014.94626: _execute() done 18823 1726855014.94633: dumping result to json 18823 1726855014.94640: done dumping result, returning 18823 1726855014.94793: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcc66-ac2b-d391-077c-000000000132] 18823 1726855014.94796: sending task result for task 0affcc66-ac2b-d391-077c-000000000132 18823 1726855014.94862: done sending task result for task 0affcc66-ac2b-d391-077c-000000000132 18823 1726855014.94865: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 18823 1726855014.94919: no more pending results, returning what we have 18823 1726855014.94923: results queue empty 18823 1726855014.94924: checking for any_errors_fatal 18823 1726855014.94932: done checking for any_errors_fatal 18823 1726855014.94933: 
checking for max_fail_percentage 18823 1726855014.94935: done checking for max_fail_percentage 18823 1726855014.94936: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.94937: done checking to see if all hosts have failed 18823 1726855014.94938: getting the remaining hosts for this loop 18823 1726855014.94939: done getting the remaining hosts for this loop 18823 1726855014.94943: getting the next task for host managed_node2 18823 1726855014.94951: done getting next task for host managed_node2 18823 1726855014.94954: ^ task is: TASK: Include the task 'show_interfaces.yml' 18823 1726855014.94957: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.94962: getting variables 18823 1726855014.94964: in VariableManager get_vars() 18823 1726855014.94993: Calling all_inventory to load vars for managed_node2 18823 1726855014.94997: Calling groups_inventory to load vars for managed_node2 18823 1726855014.95000: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.95017: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.95021: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.95025: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.95301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.95512: done with get_vars() 18823 1726855014.95521: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:56:54 -0400 (0:00:00.024) 0:00:06.607 ****** 18823 1726855014.95606: entering _queue_task() for managed_node2/include_tasks 18823 1726855014.95819: worker is 1 (out of 1 available) 18823 1726855014.95830: exiting _queue_task() for managed_node2/include_tasks 18823 1726855014.95840: done queuing things up, now waiting for results queue to drain 18823 1726855014.95841: waiting for pending results... 
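The two skipped tasks above report `false_condition` values of `state not in ["present", "absent"]` and `type not in ["dummy", "tap", "veth"]`, which is the signature of guard tasks built on the `fail` action with a `when` clause. A hedged reconstruction of manage_test_interface.yml:3 and :8; the `when` expressions come directly from the logged conditions, while the `msg` texts are assumptions:

```yaml
# Hypothetical reconstruction of the validation tasks at
# manage_test_interface.yml:3 and :8. The when: expressions match
# the logged false_condition values; the msg texts are assumptions.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent'"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be 'dummy', 'tap' or 'veth'"
  when: type not in ["dummy", "tap", "veth"]
```

Because both conditionals evaluated to False, each task was skipped rather than failing the play, which is exactly what the `skipping: [managed_node2]` results above show.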
18823 1726855014.96069: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 18823 1726855014.96275: in run() - task 0affcc66-ac2b-d391-077c-000000000133 18823 1726855014.96280: variable 'ansible_search_path' from source: unknown 18823 1726855014.96283: variable 'ansible_search_path' from source: unknown 18823 1726855014.96286: calling self._execute() 18823 1726855014.96306: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.96316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.96330: variable 'omit' from source: magic vars 18823 1726855014.96673: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.96694: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.96709: _execute() done 18823 1726855014.96718: dumping result to json 18823 1726855014.96729: done dumping result, returning 18823 1726855014.96740: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-d391-077c-000000000133] 18823 1726855014.96748: sending task result for task 0affcc66-ac2b-d391-077c-000000000133 18823 1726855014.96865: no more pending results, returning what we have 18823 1726855014.96870: in VariableManager get_vars() 18823 1726855014.96907: Calling all_inventory to load vars for managed_node2 18823 1726855014.96910: Calling groups_inventory to load vars for managed_node2 18823 1726855014.96913: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.96926: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.96929: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.96932: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.97309: done sending task result for task 0affcc66-ac2b-d391-077c-000000000133 18823 1726855014.97313: WORKER PROCESS EXITING 18823 1726855014.97333: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.97532: done with get_vars() 18823 1726855014.97539: variable 'ansible_search_path' from source: unknown 18823 1726855014.97540: variable 'ansible_search_path' from source: unknown 18823 1726855014.97573: we have included files to process 18823 1726855014.97575: generating all_blocks data 18823 1726855014.97576: done generating all_blocks data 18823 1726855014.97581: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18823 1726855014.97582: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18823 1726855014.97584: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18823 1726855014.97682: in VariableManager get_vars() 18823 1726855014.97701: done with get_vars() 18823 1726855014.97809: done processing included file 18823 1726855014.97811: iterating over new_blocks loaded from include file 18823 1726855014.97812: in VariableManager get_vars() 18823 1726855014.97825: done with get_vars() 18823 1726855014.97826: filtering new block on tags 18823 1726855014.97842: done filtering new block on tags 18823 1726855014.97844: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 18823 1726855014.97848: extending task lists for all hosts with included blocks 18823 1726855014.98247: done extending task lists 18823 1726855014.98248: done processing included files 18823 1726855014.98249: results queue empty 18823 1726855014.98250: checking for any_errors_fatal 18823 1726855014.98252: done checking for any_errors_fatal 18823 1726855014.98252: checking for 
max_fail_percentage 18823 1726855014.98253: done checking for max_fail_percentage 18823 1726855014.98254: checking to see if all hosts have failed and the running result is not ok 18823 1726855014.98255: done checking to see if all hosts have failed 18823 1726855014.98255: getting the remaining hosts for this loop 18823 1726855014.98257: done getting the remaining hosts for this loop 18823 1726855014.98259: getting the next task for host managed_node2 18823 1726855014.98263: done getting next task for host managed_node2 18823 1726855014.98265: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18823 1726855014.98268: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855014.98270: getting variables 18823 1726855014.98271: in VariableManager get_vars() 18823 1726855014.98279: Calling all_inventory to load vars for managed_node2 18823 1726855014.98281: Calling groups_inventory to load vars for managed_node2 18823 1726855014.98283: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855014.98289: Calling all_plugins_play to load vars for managed_node2 18823 1726855014.98292: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855014.98295: Calling groups_plugins_play to load vars for managed_node2 18823 1726855014.98460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855014.98652: done with get_vars() 18823 1726855014.98661: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:56:54 -0400 (0:00:00.031) 0:00:06.639 ****** 18823 1726855014.98738: entering _queue_task() for managed_node2/include_tasks 18823 1726855014.98975: worker is 1 (out of 1 available) 18823 1726855014.98986: exiting _queue_task() for managed_node2/include_tasks 18823 1726855014.99000: done queuing things up, now waiting for results queue to drain 18823 1726855014.99001: waiting for pending results... 
18823 1726855014.99235: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 18823 1726855014.99342: in run() - task 0affcc66-ac2b-d391-077c-00000000015c 18823 1726855014.99362: variable 'ansible_search_path' from source: unknown 18823 1726855014.99394: variable 'ansible_search_path' from source: unknown 18823 1726855014.99416: calling self._execute() 18823 1726855014.99492: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855014.99508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855014.99592: variable 'omit' from source: magic vars 18823 1726855014.99876: variable 'ansible_distribution_major_version' from source: facts 18823 1726855014.99896: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855014.99909: _execute() done 18823 1726855014.99918: dumping result to json 18823 1726855014.99931: done dumping result, returning 18823 1726855014.99942: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-d391-077c-00000000015c] 18823 1726855014.99951: sending task result for task 0affcc66-ac2b-d391-077c-00000000015c 18823 1726855015.00066: no more pending results, returning what we have 18823 1726855015.00071: in VariableManager get_vars() 18823 1726855015.00107: Calling all_inventory to load vars for managed_node2 18823 1726855015.00110: Calling groups_inventory to load vars for managed_node2 18823 1726855015.00114: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855015.00127: Calling all_plugins_play to load vars for managed_node2 18823 1726855015.00130: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855015.00134: Calling groups_plugins_play to load vars for managed_node2 18823 1726855015.00497: done sending task result for task 0affcc66-ac2b-d391-077c-00000000015c 18823 1726855015.00500: WORKER PROCESS EXITING 18823 
1726855015.00524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855015.00734: done with get_vars() 18823 1726855015.00740: variable 'ansible_search_path' from source: unknown 18823 1726855015.00741: variable 'ansible_search_path' from source: unknown 18823 1726855015.00796: we have included files to process 18823 1726855015.00797: generating all_blocks data 18823 1726855015.00799: done generating all_blocks data 18823 1726855015.00800: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18823 1726855015.00801: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18823 1726855015.00806: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18823 1726855015.01047: done processing included file 18823 1726855015.01049: iterating over new_blocks loaded from include file 18823 1726855015.01050: in VariableManager get_vars() 18823 1726855015.01062: done with get_vars() 18823 1726855015.01064: filtering new block on tags 18823 1726855015.01079: done filtering new block on tags 18823 1726855015.01081: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 18823 1726855015.01085: extending task lists for all hosts with included blocks 18823 1726855015.01231: done extending task lists 18823 1726855015.01232: done processing included files 18823 1726855015.01233: results queue empty 18823 1726855015.01233: checking for any_errors_fatal 18823 1726855015.01236: done checking for any_errors_fatal 18823 1726855015.01237: checking for max_fail_percentage 18823 1726855015.01238: done 
checking for max_fail_percentage 18823 1726855015.01238: checking to see if all hosts have failed and the running result is not ok 18823 1726855015.01239: done checking to see if all hosts have failed 18823 1726855015.01240: getting the remaining hosts for this loop 18823 1726855015.01241: done getting the remaining hosts for this loop 18823 1726855015.01243: getting the next task for host managed_node2 18823 1726855015.01248: done getting next task for host managed_node2 18823 1726855015.01249: ^ task is: TASK: Gather current interface info 18823 1726855015.01253: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855015.01255: getting variables 18823 1726855015.01255: in VariableManager get_vars() 18823 1726855015.01262: Calling all_inventory to load vars for managed_node2 18823 1726855015.01264: Calling groups_inventory to load vars for managed_node2 18823 1726855015.01267: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855015.01271: Calling all_plugins_play to load vars for managed_node2 18823 1726855015.01273: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855015.01275: Calling groups_plugins_play to load vars for managed_node2 18823 1726855015.01417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855015.01611: done with get_vars() 18823 1726855015.01620: done getting variables 18823 1726855015.01655: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:56:55 -0400 (0:00:00.029) 0:00:06.668 ****** 18823 1726855015.01682: entering _queue_task() for managed_node2/command 18823 1726855015.01907: worker is 1 (out of 1 available) 18823 1726855015.01918: exiting _queue_task() for managed_node2/command 18823 1726855015.01929: done queuing things up, now waiting for results queue to drain 18823 1726855015.01930: waiting for pending results... 
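The banner above queues the `Gather current interface info` task for managed_node2; its module arguments appear further down in this log (`ls -1` with `chdir: /sys/class/net`). As a rough Python sketch, assuming only the result fields visible in this log (`rc`, `stdout`, `cmd`, `start`/`end`/`delta`), the core of what the transferred command module does on the remote host looks like this; the real AnsiballZ payload handles many more options (`_uses_shell`, `creates`, `removes`, etc.):

```python
import datetime
import subprocess

def run_command(argv, chdir=None):
    """Simplified stand-in for the command module's core: run argv in
    an optional working directory, capture rc/stdout/stderr and wall-clock
    timing, and return a dict shaped like the JSON result in this log.
    This is a sketch for illustration, not the real module code."""
    start = datetime.datetime.now()
    proc = subprocess.run(argv, cwd=chdir, capture_output=True, text=True)
    end = datetime.datetime.now()
    return {
        "changed": True,
        "cmd": argv,
        "rc": proc.returncode,
        "stdout": proc.stdout.rstrip("\n"),  # mirrors strip_empty_ends
        "stderr": proc.stderr.rstrip("\n"),
        "start": start.isoformat(sep=" "),
        "end": end.isoformat(sep=" "),
        "delta": str(end - start),
    }
```

On the managed node, `run_command(["ls", "-1"], chdir="/sys/class/net")` would produce the interface listing that appears in the module reply below.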
18823 1726855015.02154: running TaskExecutor() for managed_node2/TASK: Gather current interface info 18823 1726855015.02267: in run() - task 0affcc66-ac2b-d391-077c-000000000193 18823 1726855015.02493: variable 'ansible_search_path' from source: unknown 18823 1726855015.02497: variable 'ansible_search_path' from source: unknown 18823 1726855015.02499: calling self._execute() 18823 1726855015.02504: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.02507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.02510: variable 'omit' from source: magic vars 18823 1726855015.02832: variable 'ansible_distribution_major_version' from source: facts 18823 1726855015.02853: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855015.02864: variable 'omit' from source: magic vars 18823 1726855015.02921: variable 'omit' from source: magic vars 18823 1726855015.02963: variable 'omit' from source: magic vars 18823 1726855015.03009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855015.03047: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855015.03075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855015.03099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.03164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.03167: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855015.03170: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.03172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 
1726855015.03264: Set connection var ansible_timeout to 10 18823 1726855015.03283: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855015.03293: Set connection var ansible_shell_type to sh 18823 1726855015.03307: Set connection var ansible_shell_executable to /bin/sh 18823 1726855015.03318: Set connection var ansible_connection to ssh 18823 1726855015.03328: Set connection var ansible_pipelining to False 18823 1726855015.03382: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.03385: variable 'ansible_connection' from source: unknown 18823 1726855015.03389: variable 'ansible_module_compression' from source: unknown 18823 1726855015.03392: variable 'ansible_shell_type' from source: unknown 18823 1726855015.03394: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.03396: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.03398: variable 'ansible_pipelining' from source: unknown 18823 1726855015.03400: variable 'ansible_timeout' from source: unknown 18823 1726855015.03405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.03594: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855015.03598: variable 'omit' from source: magic vars 18823 1726855015.03600: starting attempt loop 18823 1726855015.03604: running the handler 18823 1726855015.03606: _low_level_execute_command(): starting 18823 1726855015.03608: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855015.04337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855015.04356: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 18823 1726855015.04374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.04476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.04500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.04520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.04546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.04658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.06411: stdout chunk (state=3): >>>/root <<< 18823 1726855015.06496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.06522: stderr chunk (state=3): >>><<< 18823 1726855015.06525: stdout chunk (state=3): >>><<< 18823 1726855015.06552: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.06561: _low_level_execute_command(): starting 18823 1726855015.06568: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048 `" && echo ansible-tmp-1726855015.0654888-19149-146721350411048="` echo /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048 `" ) && sleep 0' 18823 1726855015.06984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.07006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855015.07010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.07031: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.07078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.07081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.07090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.07154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.09081: stdout chunk (state=3): >>>ansible-tmp-1726855015.0654888-19149-146721350411048=/root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048 <<< 18823 1726855015.09196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.09224: stderr chunk (state=3): >>><<< 18823 1726855015.09227: stdout chunk (state=3): >>><<< 18823 1726855015.09238: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855015.0654888-19149-146721350411048=/root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.09293: variable 'ansible_module_compression' from source: unknown 18823 1726855015.09311: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855015.09340: variable 'ansible_facts' from source: unknown 18823 1726855015.09394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py 18823 1726855015.09495: Sending initial data 18823 1726855015.09498: Sent initial data (156 bytes) 18823 1726855015.09925: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855015.09929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.09932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.09935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.09986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.09992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.10071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.11641: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855015.11738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855015.11807: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpys4jjic9 /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py <<< 18823 1726855015.11810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py" <<< 18823 1726855015.11874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpys4jjic9" to remote "/root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py" <<< 18823 1726855015.11877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py" <<< 18823 1726855015.12527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.12566: stderr chunk (state=3): >>><<< 18823 1726855015.12569: stdout chunk (state=3): >>><<< 18823 1726855015.12575: done transferring module to remote 18823 1726855015.12591: _low_level_execute_command(): starting 18823 1726855015.12593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/ /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py && sleep 0' 18823 1726855015.12972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.13005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855015.13008: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.13010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855015.13013: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.13020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.13062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.13065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.13138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.14977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.14980: stdout chunk (state=3): >>><<< 18823 1726855015.14982: stderr chunk (state=3): >>><<< 18823 1726855015.15007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.15078: _low_level_execute_command(): starting 18823 1726855015.15081: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/AnsiballZ_command.py && sleep 0' 18823 1726855015.15558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.15564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855015.15570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.15627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.15630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.15661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.15664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.15744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.31185: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:56:55.306246", "end": "2024-09-20 13:56:55.309596", "delta": "0:00:00.003350", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855015.32747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855015.32751: stdout chunk (state=3): >>><<< 18823 1726855015.32754: stderr chunk (state=3): >>><<< 18823 1726855015.32895: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:56:55.306246", "end": "2024-09-20 13:56:55.309596", "delta": "0:00:00.003350", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
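The module reply summarized above is a single JSON object on stdout. A short sketch of pulling it apart, using the payload exactly as printed in this log (trimmed to the fields of interest); it also verifies that the `delta` field is consistent with `end - start`:

```python
import datetime
import json

# A trimmed copy of the JSON result printed in the log above.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
       '"rc": 0, "cmd": ["ls", "-1"], '
       '"start": "2024-09-20 13:56:55.306246", '
       '"end": "2024-09-20 13:56:55.309596", "delta": "0:00:00.003350"}')

result = json.loads(raw)
interfaces = result["stdout"].splitlines()  # what stdout_lines reports

# The timing fields are mutually consistent: delta == end - start.
fmt = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.datetime.strptime(result["start"], fmt)
end = datetime.datetime.strptime(result["end"], fmt)
assert str(end - start) == result["delta"]
```

The three lines of `stdout` here (`bonding_masters`, `eth0`, `lo`) are what the rendered `STDOUT:` section of the task result shows further down.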
18823 1726855015.32900: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855015.32906: _low_level_execute_command(): starting 18823 1726855015.32908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855015.0654888-19149-146721350411048/ > /dev/null 2>&1 && sleep 0' 18823 1726855015.33571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855015.33595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855015.33619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.33638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855015.33707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.33764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.33783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.33822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.33935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.35868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.35871: stdout chunk (state=3): >>><<< 18823 1726855015.35874: stderr chunk (state=3): >>><<< 18823 1726855015.35895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.35993: handler run complete 18823 1726855015.35996: Evaluated conditional (False): False 18823 1726855015.35999: attempt loop complete, returning result 18823 1726855015.36001: _execute() done 18823 1726855015.36005: dumping result to json 18823 1726855015.36007: done dumping result, returning 18823 1726855015.36010: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcc66-ac2b-d391-077c-000000000193] 18823 1726855015.36012: sending task result for task 0affcc66-ac2b-d391-077c-000000000193 ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003350", "end": "2024-09-20 13:56:55.309596", "rc": 0, "start": "2024-09-20 13:56:55.306246" } STDOUT: bonding_masters eth0 lo 18823 1726855015.36310: no more pending results, returning what we have 18823 1726855015.36314: results queue empty 18823 1726855015.36315: checking for any_errors_fatal 18823 1726855015.36317: done checking for any_errors_fatal 18823 1726855015.36318: checking for max_fail_percentage 18823 1726855015.36320: done checking for max_fail_percentage 18823 1726855015.36321: checking to see if all hosts have failed and the running result is not ok 18823 1726855015.36322: done checking to see if all hosts have failed 18823 1726855015.36322: getting the remaining hosts for this loop 18823 1726855015.36324: done getting the remaining hosts for this loop 18823 1726855015.36328: getting the next task for host managed_node2 18823 1726855015.36336: done getting next task for host managed_node2 18823 1726855015.36339: ^ task is: TASK: Set current_interfaces 18823 1726855015.36343: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855015.36347: getting variables 18823 1726855015.36349: in VariableManager get_vars() 18823 1726855015.36550: Calling all_inventory to load vars for managed_node2 18823 1726855015.36553: Calling groups_inventory to load vars for managed_node2 18823 1726855015.36557: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855015.36615: done sending task result for task 0affcc66-ac2b-d391-077c-000000000193 18823 1726855015.36618: WORKER PROCESS EXITING 18823 1726855015.36628: Calling all_plugins_play to load vars for managed_node2 18823 1726855015.36632: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855015.36636: Calling groups_plugins_play to load vars for managed_node2 18823 1726855015.36914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855015.37137: done with get_vars() 18823 1726855015.37163: done getting variables 18823 1726855015.37236: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:56:55 -0400 (0:00:00.355) 0:00:07.024 ****** 18823 1726855015.37272: entering _queue_task() for managed_node2/set_fact 18823 1726855015.37724: worker is 1 (out of 1 available) 18823 1726855015.37733: exiting _queue_task() for managed_node2/set_fact 18823 1726855015.37742: done queuing things up, now waiting for results queue to drain 18823 1726855015.37743: waiting for pending results... 18823 1726855015.37899: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 18823 1726855015.38036: in run() - task 0affcc66-ac2b-d391-077c-000000000194 18823 1726855015.38052: variable 'ansible_search_path' from source: unknown 18823 1726855015.38058: variable 'ansible_search_path' from source: unknown 18823 1726855015.38101: calling self._execute() 18823 1726855015.38193: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.38207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.38224: variable 'omit' from source: magic vars 18823 1726855015.38610: variable 'ansible_distribution_major_version' from source: facts 18823 1726855015.38628: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855015.38642: variable 'omit' from source: magic vars 18823 1726855015.38707: variable 'omit' from source: magic vars 18823 1726855015.38904: variable '_current_interfaces' from source: set_fact 18823 1726855015.38907: variable 'omit' from source: magic vars 18823 
1726855015.38932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855015.38969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855015.38997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855015.39030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.39047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.39082: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855015.39095: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.39120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.39228: Set connection var ansible_timeout to 10 18823 1726855015.39337: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855015.39342: Set connection var ansible_shell_type to sh 18823 1726855015.39345: Set connection var ansible_shell_executable to /bin/sh 18823 1726855015.39347: Set connection var ansible_connection to ssh 18823 1726855015.39349: Set connection var ansible_pipelining to False 18823 1726855015.39351: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.39354: variable 'ansible_connection' from source: unknown 18823 1726855015.39356: variable 'ansible_module_compression' from source: unknown 18823 1726855015.39358: variable 'ansible_shell_type' from source: unknown 18823 1726855015.39360: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.39362: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.39364: variable 'ansible_pipelining' from source: unknown 18823 1726855015.39365: variable 
'ansible_timeout' from source: unknown 18823 1726855015.39367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.39495: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855015.39513: variable 'omit' from source: magic vars 18823 1726855015.39523: starting attempt loop 18823 1726855015.39529: running the handler 18823 1726855015.39553: handler run complete 18823 1726855015.39591: attempt loop complete, returning result 18823 1726855015.39594: _execute() done 18823 1726855015.39596: dumping result to json 18823 1726855015.39598: done dumping result, returning 18823 1726855015.39601: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcc66-ac2b-d391-077c-000000000194] 18823 1726855015.39605: sending task result for task 0affcc66-ac2b-d391-077c-000000000194 18823 1726855015.39926: done sending task result for task 0affcc66-ac2b-d391-077c-000000000194 18823 1726855015.39930: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18823 1726855015.39984: no more pending results, returning what we have 18823 1726855015.39988: results queue empty 18823 1726855015.39989: checking for any_errors_fatal 18823 1726855015.39995: done checking for any_errors_fatal 18823 1726855015.39996: checking for max_fail_percentage 18823 1726855015.39998: done checking for max_fail_percentage 18823 1726855015.39999: checking to see if all hosts have failed and the running result is not ok 18823 1726855015.40000: done checking to see if all hosts have failed 18823 1726855015.40000: getting the remaining hosts for this loop 18823 1726855015.40004: done getting the 
remaining hosts for this loop 18823 1726855015.40008: getting the next task for host managed_node2 18823 1726855015.40014: done getting next task for host managed_node2 18823 1726855015.40017: ^ task is: TASK: Show current_interfaces 18823 1726855015.40020: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855015.40023: getting variables 18823 1726855015.40025: in VariableManager get_vars() 18823 1726855015.40054: Calling all_inventory to load vars for managed_node2 18823 1726855015.40057: Calling groups_inventory to load vars for managed_node2 18823 1726855015.40060: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855015.40069: Calling all_plugins_play to load vars for managed_node2 18823 1726855015.40072: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855015.40075: Calling groups_plugins_play to load vars for managed_node2 18823 1726855015.40328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855015.40567: done with get_vars() 18823 1726855015.40577: done getting variables 18823 1726855015.40640: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:56:55 -0400 (0:00:00.033) 0:00:07.058 ****** 18823 1726855015.40671: entering _queue_task() for managed_node2/debug 18823 1726855015.40960: worker is 1 (out of 1 available) 18823 1726855015.40971: exiting _queue_task() for managed_node2/debug 18823 1726855015.40981: done queuing things up, now waiting for results queue to drain 18823 1726855015.40982: waiting for pending results... 
18823 1726855015.41243: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 18823 1726855015.41359: in run() - task 0affcc66-ac2b-d391-077c-00000000015d 18823 1726855015.41377: variable 'ansible_search_path' from source: unknown 18823 1726855015.41385: variable 'ansible_search_path' from source: unknown 18823 1726855015.41455: calling self._execute() 18823 1726855015.41521: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.41532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.41545: variable 'omit' from source: magic vars 18823 1726855015.41954: variable 'ansible_distribution_major_version' from source: facts 18823 1726855015.41963: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855015.41995: variable 'omit' from source: magic vars 18823 1726855015.42034: variable 'omit' from source: magic vars 18823 1726855015.42143: variable 'current_interfaces' from source: set_fact 18823 1726855015.42212: variable 'omit' from source: magic vars 18823 1726855015.42231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855015.42270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855015.42305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855015.42332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.42429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.42432: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855015.42434: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.42437: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.42505: Set connection var ansible_timeout to 10 18823 1726855015.42517: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855015.42523: Set connection var ansible_shell_type to sh 18823 1726855015.42536: Set connection var ansible_shell_executable to /bin/sh 18823 1726855015.42549: Set connection var ansible_connection to ssh 18823 1726855015.42558: Set connection var ansible_pipelining to False 18823 1726855015.42588: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.42597: variable 'ansible_connection' from source: unknown 18823 1726855015.42607: variable 'ansible_module_compression' from source: unknown 18823 1726855015.42614: variable 'ansible_shell_type' from source: unknown 18823 1726855015.42620: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.42644: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.42646: variable 'ansible_pipelining' from source: unknown 18823 1726855015.42648: variable 'ansible_timeout' from source: unknown 18823 1726855015.42650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.42798: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855015.42817: variable 'omit' from source: magic vars 18823 1726855015.42826: starting attempt loop 18823 1726855015.42832: running the handler 18823 1726855015.42884: handler run complete 18823 1726855015.42906: attempt loop complete, returning result 18823 1726855015.42913: _execute() done 18823 1726855015.42920: dumping result to json 18823 1726855015.42926: done dumping result, returning 18823 1726855015.42936: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcc66-ac2b-d391-077c-00000000015d] 18823 1726855015.42945: sending task result for task 0affcc66-ac2b-d391-077c-00000000015d ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18823 1726855015.43126: no more pending results, returning what we have 18823 1726855015.43130: results queue empty 18823 1726855015.43131: checking for any_errors_fatal 18823 1726855015.43137: done checking for any_errors_fatal 18823 1726855015.43137: checking for max_fail_percentage 18823 1726855015.43140: done checking for max_fail_percentage 18823 1726855015.43141: checking to see if all hosts have failed and the running result is not ok 18823 1726855015.43142: done checking to see if all hosts have failed 18823 1726855015.43142: getting the remaining hosts for this loop 18823 1726855015.43144: done getting the remaining hosts for this loop 18823 1726855015.43148: getting the next task for host managed_node2 18823 1726855015.43157: done getting next task for host managed_node2 18823 1726855015.43160: ^ task is: TASK: Install iproute 18823 1726855015.43163: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855015.43168: getting variables 18823 1726855015.43170: in VariableManager get_vars() 18823 1726855015.43201: Calling all_inventory to load vars for managed_node2 18823 1726855015.43207: Calling groups_inventory to load vars for managed_node2 18823 1726855015.43210: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855015.43222: Calling all_plugins_play to load vars for managed_node2 18823 1726855015.43225: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855015.43228: Calling groups_plugins_play to load vars for managed_node2 18823 1726855015.43658: done sending task result for task 0affcc66-ac2b-d391-077c-00000000015d 18823 1726855015.43661: WORKER PROCESS EXITING 18823 1726855015.43683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855015.43894: done with get_vars() 18823 1726855015.43907: done getting variables 18823 1726855015.43971: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:56:55 -0400 (0:00:00.033) 0:00:07.091 ****** 18823 1726855015.44001: entering _queue_task() for managed_node2/package 18823 1726855015.44380: worker is 1 (out of 1 available) 18823 1726855015.44394: exiting _queue_task() for managed_node2/package 18823 1726855015.44406: done queuing things up, now waiting for results queue to drain 18823 1726855015.44407: waiting for pending results... 
18823 1726855015.44547: running TaskExecutor() for managed_node2/TASK: Install iproute 18823 1726855015.44654: in run() - task 0affcc66-ac2b-d391-077c-000000000134 18823 1726855015.44672: variable 'ansible_search_path' from source: unknown 18823 1726855015.44679: variable 'ansible_search_path' from source: unknown 18823 1726855015.44727: calling self._execute() 18823 1726855015.44812: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.44824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.44840: variable 'omit' from source: magic vars 18823 1726855015.45223: variable 'ansible_distribution_major_version' from source: facts 18823 1726855015.45238: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855015.45251: variable 'omit' from source: magic vars 18823 1726855015.45290: variable 'omit' from source: magic vars 18823 1726855015.45678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855015.50038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855015.50108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855015.50141: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855015.50174: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855015.50221: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855015.50458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855015.50462: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855015.50465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855015.50467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855015.50469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855015.50532: variable '__network_is_ostree' from source: set_fact 18823 1726855015.50537: variable 'omit' from source: magic vars 18823 1726855015.50569: variable 'omit' from source: magic vars 18823 1726855015.50599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855015.50631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855015.50648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855015.50668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.50676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855015.50710: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855015.50713: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.50722: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855015.50895: Set connection var ansible_timeout to 10 18823 1726855015.50899: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855015.50901: Set connection var ansible_shell_type to sh 18823 1726855015.50907: Set connection var ansible_shell_executable to /bin/sh 18823 1726855015.50910: Set connection var ansible_connection to ssh 18823 1726855015.50912: Set connection var ansible_pipelining to False 18823 1726855015.50914: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.50916: variable 'ansible_connection' from source: unknown 18823 1726855015.50918: variable 'ansible_module_compression' from source: unknown 18823 1726855015.50920: variable 'ansible_shell_type' from source: unknown 18823 1726855015.50922: variable 'ansible_shell_executable' from source: unknown 18823 1726855015.50923: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855015.50925: variable 'ansible_pipelining' from source: unknown 18823 1726855015.50927: variable 'ansible_timeout' from source: unknown 18823 1726855015.50929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855015.51072: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855015.51114: variable 'omit' from source: magic vars 18823 1726855015.51117: starting attempt loop 18823 1726855015.51119: running the handler 18823 1726855015.51125: variable 'ansible_facts' from source: unknown 18823 1726855015.51127: variable 'ansible_facts' from source: unknown 18823 1726855015.51171: _low_level_execute_command(): starting 18823 1726855015.51178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 
1726855015.52001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855015.52013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855015.52024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.52038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855015.52050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855015.52061: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855015.52148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.52180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.52263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.53975: stdout chunk (state=3): >>>/root <<< 18823 1726855015.54080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.54117: stderr chunk (state=3): >>><<< 18823 1726855015.54120: stdout chunk (state=3): >>><<< 18823 1726855015.54161: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.54164: _low_level_execute_command(): starting 18823 1726855015.54167: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222 `" && echo ansible-tmp-1726855015.5414097-19172-433115675222="` echo /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222 `" ) && sleep 0' 18823 1726855015.54809: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.54834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.54848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.54869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.54972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.56878: stdout chunk (state=3): >>>ansible-tmp-1726855015.5414097-19172-433115675222=/root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222 <<< 18823 1726855015.56980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.57013: stderr chunk (state=3): >>><<< 18823 1726855015.57017: stdout chunk (state=3): >>><<< 18823 1726855015.57094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855015.5414097-19172-433115675222=/root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.57097: variable 'ansible_module_compression' from source: unknown 18823 1726855015.57112: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 18823 1726855015.57116: ANSIBALLZ: Acquiring lock 18823 1726855015.57118: ANSIBALLZ: Lock acquired: 140142269228544 18823 1726855015.57121: ANSIBALLZ: Creating module 18823 1726855015.69944: ANSIBALLZ: Writing module into payload 18823 1726855015.70085: ANSIBALLZ: Writing module 18823 1726855015.70101: ANSIBALLZ: Renaming module 18823 1726855015.70116: ANSIBALLZ: Done creating module 18823 1726855015.70131: variable 'ansible_facts' from source: unknown 18823 1726855015.70203: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py 18823 1726855015.70310: Sending initial data 18823 1726855015.70314: Sent initial data (149 bytes) 18823 1726855015.70749: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.70783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found <<< 18823 1726855015.70786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.70791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855015.70794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.70796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.70845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.70849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.70851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.70925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.72510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855015.72573: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855015.72643: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp67ghdmoy /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py <<< 18823 1726855015.72646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py" <<< 18823 1726855015.72718: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp67ghdmoy" to remote "/root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py" <<< 18823 1726855015.72721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py" <<< 18823 1726855015.73482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.73529: stderr chunk (state=3): >>><<< 18823 1726855015.73532: stdout chunk (state=3): >>><<< 18823 1726855015.73554: done transferring module to remote 18823 1726855015.73563: _low_level_execute_command(): starting 18823 1726855015.73568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/ /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py && sleep 0' 18823 1726855015.74015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855015.74019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855015.74021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.74025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.74027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.74072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.74075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.74151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855015.75890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855015.75916: stderr chunk (state=3): >>><<< 18823 1726855015.75920: stdout chunk (state=3): >>><<< 18823 1726855015.75934: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855015.75937: _low_level_execute_command(): starting 18823 1726855015.75941: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/AnsiballZ_dnf.py && sleep 0' 18823 1726855015.76375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.76378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855015.76381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855015.76383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855015.76385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855015.76440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855015.76447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855015.76449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855015.76522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.17196: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 18823 1726855016.21145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.21192: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855016.21214: stdout chunk (state=3): >>><<< 18823 1726855016.21241: stderr chunk (state=3): >>><<< 18823 1726855016.21263: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855016.21393: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855016.21397: _low_level_execute_command(): starting 18823 1726855016.21399: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855015.5414097-19172-433115675222/ > /dev/null 2>&1 && sleep 0' 18823 1726855016.21964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855016.21977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855016.21993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.22069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.22116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.22141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.22155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.22265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.24595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.24599: stdout chunk (state=3): >>><<< 18823 1726855016.24601: stderr chunk (state=3): >>><<< 18823 1726855016.24605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.24608: handler run complete 18823 1726855016.24609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855016.25071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855016.25122: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855016.25166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855016.25209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855016.25298: variable '__install_status' from source: unknown 18823 1726855016.25327: Evaluated conditional (__install_status is success): True 18823 1726855016.25348: attempt loop complete, returning result 18823 1726855016.25355: _execute() done 18823 1726855016.25363: dumping result to json 18823 1726855016.25381: done dumping result, returning 18823 1726855016.25396: done running TaskExecutor() for managed_node2/TASK: Install iproute [0affcc66-ac2b-d391-077c-000000000134] 18823 1726855016.25408: sending task result for task 0affcc66-ac2b-d391-077c-000000000134 ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 18823 1726855016.25685: no more pending results, returning what we have 18823 1726855016.25692: results queue empty 18823 1726855016.25693: checking for any_errors_fatal 18823 1726855016.25704: done checking for any_errors_fatal 18823 1726855016.25705: checking for max_fail_percentage 18823 1726855016.25707: done checking for max_fail_percentage 18823 1726855016.25708: checking 
to see if all hosts have failed and the running result is not ok 18823 1726855016.25709: done checking to see if all hosts have failed 18823 1726855016.25710: getting the remaining hosts for this loop 18823 1726855016.25712: done getting the remaining hosts for this loop 18823 1726855016.25716: getting the next task for host managed_node2 18823 1726855016.25723: done getting next task for host managed_node2 18823 1726855016.25726: ^ task is: TASK: Create veth interface {{ interface }} 18823 1726855016.25729: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855016.25733: getting variables 18823 1726855016.25735: in VariableManager get_vars() 18823 1726855016.25768: Calling all_inventory to load vars for managed_node2 18823 1726855016.25771: Calling groups_inventory to load vars for managed_node2 18823 1726855016.25775: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855016.25920: Calling all_plugins_play to load vars for managed_node2 18823 1726855016.25926: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855016.25932: done sending task result for task 0affcc66-ac2b-d391-077c-000000000134 18823 1726855016.25934: WORKER PROCESS EXITING 18823 1726855016.25938: Calling groups_plugins_play to load vars for managed_node2 18823 1726855016.26669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855016.26938: done with get_vars() 18823 1726855016.26949: done getting variables 18823 1726855016.27016: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855016.27398: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:56:56 -0400 (0:00:00.834) 0:00:07.925 ****** 18823 1726855016.27429: entering _queue_task() for managed_node2/command 18823 1726855016.28075: worker is 1 (out of 1 available) 18823 1726855016.28138: exiting _queue_task() for managed_node2/command 18823 1726855016.28149: done queuing things up, now waiting for results queue to drain 18823 1726855016.28150: waiting for pending results... 
18823 1726855016.28583: running TaskExecutor() for managed_node2/TASK: Create veth interface lsr27 18823 1726855016.28712: in run() - task 0affcc66-ac2b-d391-077c-000000000135 18823 1726855016.28734: variable 'ansible_search_path' from source: unknown 18823 1726855016.28741: variable 'ansible_search_path' from source: unknown 18823 1726855016.29040: variable 'interface' from source: set_fact 18823 1726855016.29132: variable 'interface' from source: set_fact 18823 1726855016.29211: variable 'interface' from source: set_fact 18823 1726855016.29592: Loaded config def from plugin (lookup/items) 18823 1726855016.29596: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 18823 1726855016.29599: variable 'omit' from source: magic vars 18823 1726855016.29601: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855016.29603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855016.29605: variable 'omit' from source: magic vars 18823 1726855016.29755: variable 'ansible_distribution_major_version' from source: facts 18823 1726855016.29768: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855016.29950: variable 'type' from source: set_fact 18823 1726855016.29960: variable 'state' from source: include params 18823 1726855016.29968: variable 'interface' from source: set_fact 18823 1726855016.29977: variable 'current_interfaces' from source: set_fact 18823 1726855016.29991: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18823 1726855016.30001: variable 'omit' from source: magic vars 18823 1726855016.30039: variable 'omit' from source: magic vars 18823 1726855016.30089: variable 'item' from source: unknown 18823 1726855016.30159: variable 'item' from source: unknown 18823 1726855016.30179: variable 'omit' from source: magic vars 18823 1726855016.30216: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855016.30250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855016.30273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855016.30297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855016.30311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855016.30344: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855016.30352: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855016.30360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855016.30454: Set connection var ansible_timeout to 10 18823 1726855016.30465: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855016.30471: Set connection var ansible_shell_type to sh 18823 1726855016.30479: Set connection var ansible_shell_executable to /bin/sh 18823 1726855016.30489: Set connection var ansible_connection to ssh 18823 1726855016.30498: Set connection var ansible_pipelining to False 18823 1726855016.30523: variable 'ansible_shell_executable' from source: unknown 18823 1726855016.30530: variable 'ansible_connection' from source: unknown 18823 1726855016.30536: variable 'ansible_module_compression' from source: unknown 18823 1726855016.30542: variable 'ansible_shell_type' from source: unknown 18823 1726855016.30548: variable 'ansible_shell_executable' from source: unknown 18823 1726855016.30553: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855016.30560: variable 'ansible_pipelining' from source: unknown 18823 1726855016.30565: variable 'ansible_timeout' from 
source: unknown 18823 1726855016.30571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855016.30696: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855016.30713: variable 'omit' from source: magic vars 18823 1726855016.30723: starting attempt loop 18823 1726855016.30729: running the handler 18823 1726855016.30746: _low_level_execute_command(): starting 18823 1726855016.30758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855016.31434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855016.31449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855016.31465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.31484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855016.31590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.31604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.31621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.31720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.33369: stdout chunk (state=3): >>>/root <<< 18823 1726855016.33481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.33505: stderr chunk (state=3): >>><<< 18823 1726855016.33517: stdout chunk (state=3): >>><<< 18823 1726855016.33545: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.33662: _low_level_execute_command(): starting 18823 1726855016.33666: _low_level_execute_command(): executing: 
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706 `" && echo ansible-tmp-1726855016.3356192-19213-43261094830706="` echo /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706 `" ) && sleep 0' 18823 1726855016.34309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.34397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.36281: stdout chunk (state=3): >>>ansible-tmp-1726855016.3356192-19213-43261094830706=/root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706 <<< 18823 1726855016.36424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.36436: stdout chunk (state=3): >>><<< 18823 1726855016.36593: stderr chunk (state=3): >>><<< 18823 1726855016.36598: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855016.3356192-19213-43261094830706=/root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.36600: variable 'ansible_module_compression' from source: unknown 18823 1726855016.36603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855016.36605: variable 'ansible_facts' from source: unknown 18823 1726855016.36696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py 18823 1726855016.36909: Sending initial data 18823 1726855016.36920: Sent initial data (155 bytes) 18823 1726855016.37427: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855016.37442: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855016.37502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.37559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.37576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.37601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.37701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.39236: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855016.39325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855016.39424: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp9l5xsllp /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py <<< 18823 1726855016.39434: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py" <<< 18823 1726855016.39495: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp9l5xsllp" to remote "/root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py" <<< 18823 1726855016.40336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.40345: stdout chunk (state=3): >>><<< 18823 1726855016.40354: stderr chunk (state=3): >>><<< 18823 1726855016.40384: done transferring module to remote 18823 1726855016.40470: _low_level_execute_command(): starting 18823 1726855016.40474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/ /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py && sleep 0' 18823 1726855016.41061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.41080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.41105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.41211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.43018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.43029: stdout chunk (state=3): >>><<< 18823 1726855016.43045: stderr chunk (state=3): >>><<< 18823 1726855016.43094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.43097: _low_level_execute_command(): starting 18823 1726855016.43100: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/AnsiballZ_command.py && sleep 0' 18823 1726855016.43771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855016.43827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.43908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855016.43935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' 
debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.43979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.44062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.59839: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 13:56:56.588923", "end": "2024-09-20 13:56:56.593812", "delta": "0:00:00.004889", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855016.62008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855016.62014: stdout chunk (state=3): >>><<< 18823 1726855016.62017: stderr chunk (state=3): >>><<< 18823 1726855016.62150: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 13:56:56.588923", "end": "2024-09-20 13:56:56.593812", "delta": "0:00:00.004889", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
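The round-trip above ends with the module's result coming back as a single JSON document on stdout. As a minimal sketch (the payload below is copied verbatim from the log; the field names are exactly what `AnsiballZ_command.py` emitted), it can be decoded with the standard library:

```python
import json

# Result payload returned by the first command module run, copied from the log.
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], '
    '"start": "2024-09-20 13:56:56.588923", "end": "2024-09-20 13:56:56.593812", '
    '"delta": "0:00:00.004889", "msg": ""}'
)

result = json.loads(raw)

# rc/changed/cmd are the fields the controller inspects when building the
# task outcome shown later in the log.
print(result["rc"], result["changed"], result["cmd"][:2])
```

Everything after this point in the log is the controller turning that decoded dictionary into the per-item task result.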
18823 1726855016.62154: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855016.62156: _low_level_execute_command(): starting 18823 1726855016.62157: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855016.3356192-19213-43261094830706/ > /dev/null 2>&1 && sleep 0' 18823 1726855016.62738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.62742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.62830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.62904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.66959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.66980: stderr chunk (state=3): >>><<< 18823 1726855016.66982: stdout chunk (state=3): >>><<< 18823 1726855016.66997: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.67048: handler run complete 18823 1726855016.67051: Evaluated conditional (False): False 
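The `rm -f -r … && sleep 0` step above removes the per-task scratch directory. Throughout the log those directories follow an `ansible-tmp-<epoch timestamp>-<pid>-<random suffix>` naming pattern; a hedged sanity check of that pattern (the regex is an illustration, not Ansible's own validation):

```python
import re

# Assumed shape of the remote scratch-dir names seen in the log:
# ansible-tmp-<epoch time with fraction>-<worker pid>-<random integer>
TMP_RE = re.compile(r"^ansible-tmp-\d+\.\d+-\d+-\d+$")

# Directory name copied verbatim from the log above.
name = "ansible-tmp-1726855016.3356192-19213-43261094830706"

print(bool(TMP_RE.match(name)))
```

Because the name embeds the creation timestamp and controller worker PID, each task invocation gets a unique directory, which is why the cleanup can safely `rm -r` the whole path.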
18823 1726855016.67053: attempt loop complete, returning result 18823 1726855016.67055: variable 'item' from source: unknown 18823 1726855016.67113: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.004889", "end": "2024-09-20 13:56:56.593812", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 13:56:56.588923" } 18823 1726855016.67278: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855016.67281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855016.67284: variable 'omit' from source: magic vars 18823 1726855016.67349: variable 'ansible_distribution_major_version' from source: facts 18823 1726855016.67352: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855016.67472: variable 'type' from source: set_fact 18823 1726855016.67476: variable 'state' from source: include params 18823 1726855016.67479: variable 'interface' from source: set_fact 18823 1726855016.67481: variable 'current_interfaces' from source: set_fact 18823 1726855016.67490: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18823 1726855016.67493: variable 'omit' from source: magic vars 18823 1726855016.67512: variable 'omit' from source: magic vars 18823 1726855016.67535: variable 'item' from source: unknown 18823 1726855016.67578: variable 'item' from source: unknown 18823 1726855016.67591: variable 'omit' from source: magic vars 18823 1726855016.67608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855016.67616: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855016.67623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855016.67632: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855016.67635: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855016.67638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855016.67685: Set connection var ansible_timeout to 10 18823 1726855016.67690: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855016.67693: Set connection var ansible_shell_type to sh 18823 1726855016.67698: Set connection var ansible_shell_executable to /bin/sh 18823 1726855016.67706: Set connection var ansible_connection to ssh 18823 1726855016.67708: Set connection var ansible_pipelining to False 18823 1726855016.67726: variable 'ansible_shell_executable' from source: unknown 18823 1726855016.67729: variable 'ansible_connection' from source: unknown 18823 1726855016.67732: variable 'ansible_module_compression' from source: unknown 18823 1726855016.67734: variable 'ansible_shell_type' from source: unknown 18823 1726855016.67736: variable 'ansible_shell_executable' from source: unknown 18823 1726855016.67738: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855016.67741: variable 'ansible_pipelining' from source: unknown 18823 1726855016.67743: variable 'ansible_timeout' from source: unknown 18823 1726855016.67745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855016.67810: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855016.67818: variable 'omit' from source: magic vars 18823 1726855016.67821: starting attempt loop 18823 1726855016.67824: running the handler 18823 1726855016.67831: _low_level_execute_command(): starting 18823 1726855016.67833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855016.68268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.68272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.68274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855016.68276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855016.68278: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.68334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.68341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.68343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 18823 1726855016.68419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.70013: stdout chunk (state=3): >>>/root <<< 18823 1726855016.70156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.70164: stdout chunk (state=3): >>><<< 18823 1726855016.70166: stderr chunk (state=3): >>><<< 18823 1726855016.70267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.70271: _low_level_execute_command(): starting 18823 1726855016.70274: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472 `" && echo ansible-tmp-1726855016.701843-19213-192821502090472="` echo 
/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472 `" ) && sleep 0' 18823 1726855016.70807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855016.70811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855016.70908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.70912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.71010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.72925: stdout chunk (state=3): >>>ansible-tmp-1726855016.701843-19213-192821502090472=/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472 <<< 18823 1726855016.73062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.73065: stdout chunk (state=3): >>><<< 18823 1726855016.73068: stderr chunk (state=3): >>><<< 18823 1726855016.73082: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855016.701843-19213-192821502090472=/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.73214: variable 'ansible_module_compression' from source: unknown 18823 1726855016.73218: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855016.73220: variable 'ansible_facts' from source: unknown 18823 1726855016.73297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py 18823 1726855016.73614: Sending initial data 18823 1726855016.73622: Sent initial data (155 bytes) 18823 1726855016.74019: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 18823 1726855016.74031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.74074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.74086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.74161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.75717: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855016.75782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855016.75863: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp446ax2_c /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py <<< 18823 1726855016.75866: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py" <<< 18823 1726855016.75924: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp446ax2_c" to remote "/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py" <<< 18823 1726855016.76638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.76670: stderr chunk (state=3): >>><<< 18823 1726855016.76673: stdout chunk (state=3): >>><<< 18823 1726855016.76717: done transferring module to remote 18823 1726855016.76724: _low_level_execute_command(): starting 18823 1726855016.76731: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/ /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py && sleep 0' 18823 1726855016.77143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.77146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match not found <<< 18823 1726855016.77148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855016.77150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.77152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.77202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.77206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.77282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.79019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.79042: stderr chunk (state=3): >>><<< 18823 1726855016.79045: stdout chunk (state=3): >>><<< 18823 1726855016.79060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.79063: _low_level_execute_command(): starting 18823 1726855016.79068: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/AnsiballZ_command.py && sleep 0' 18823 1726855016.79507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.79511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.79521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855016.79540: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.79581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.79584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.79668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.95039: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 13:56:56.944752", "end": "2024-09-20 13:56:56.948198", "delta": "0:00:00.003446", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855016.96548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855016.96553: stdout chunk (state=3): >>><<< 18823 1726855016.96555: stderr chunk (state=3): >>><<< 18823 1726855016.96594: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 13:56:56.944752", "end": "2024-09-20 13:56:56.948198", "delta": "0:00:00.003446", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
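In the module result above, the single `_raw_params` string `"ip link set peerlsr27 up"` comes back as the list `"cmd": ["ip", "link", "set", "peerlsr27", "up"]`. A minimal sketch of that splitting step, assuming shell-style tokenization via Python's `shlex.split` (which is what `ansible.builtin.command` applies when `_uses_shell` is false):

```python
import shlex

# With _uses_shell=false, the command module tokenizes the raw
# parameter string shell-style into the argv list that appears
# as "cmd" in the module result above.
raw_params = "ip link set peerlsr27 up"
cmd = shlex.split(raw_params)
print(cmd)  # ['ip', 'link', 'set', 'peerlsr27', 'up']
```

Because no shell is involved, quoting is honored but globbing, pipes, and redirection in `_raw_params` would be passed through literally rather than interpreted.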
18823 1726855016.96696: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855016.96700: _low_level_execute_command(): starting 18823 1726855016.96705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855016.701843-19213-192821502090472/ > /dev/null 2>&1 && sleep 0' 18823 1726855016.97307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855016.97325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855016.97351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855016.97371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855016.97464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855016.97468: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855016.97520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855016.97539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855016.97568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855016.97675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855016.99550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855016.99798: stdout chunk (state=3): >>><<< 18823 1726855016.99801: stderr chunk (state=3): >>><<< 18823 1726855016.99807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855016.99810: handler run complete 18823 1726855016.99813: Evaluated conditional (False): False 18823 1726855016.99815: attempt loop complete, returning result 18823 1726855016.99817: variable 'item' from source: unknown 18823 1726855016.99884: variable 'item' from source: unknown
ok: [managed_node2] => (item=ip link set peerlsr27 up) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "set",
        "peerlsr27",
        "up"
    ],
    "delta": "0:00:00.003446",
    "end": "2024-09-20 13:56:56.948198",
    "item": "ip link set peerlsr27 up",
    "rc": 0,
    "start": "2024-09-20 13:56:56.944752"
}
18823 1726855017.00581: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.00584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.00688: variable 'omit' from source: magic vars 18823 1726855017.00759: variable 'ansible_distribution_major_version' from source: facts 18823 1726855017.00770: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855017.01185: variable 'type' from source: set_fact 18823 1726855017.01259: variable 'state' from source: include params 18823 1726855017.01269: variable 'interface' from source: set_fact 18823 1726855017.01470: variable 'current_interfaces' from source: set_fact 18823 1726855017.01473: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18823 1726855017.01476: variable 'omit' from source: magic vars 18823 1726855017.01478: variable 'omit' from source: magic vars 18823 1726855017.01480: variable 'item' from source: unknown 18823 1726855017.01635: variable 'item' from source: unknown 18823 1726855017.01656: variable 'omit' from source: magic vars 18823 1726855017.01827:
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855017.01850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855017.01942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855017.01945: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855017.01948: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.01950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.02192: Set connection var ansible_timeout to 10 18823 1726855017.02211: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855017.02263: Set connection var ansible_shell_type to sh 18823 1726855017.02289: Set connection var ansible_shell_executable to /bin/sh 18823 1726855017.02416: Set connection var ansible_connection to ssh 18823 1726855017.02421: Set connection var ansible_pipelining to False 18823 1726855017.02452: variable 'ansible_shell_executable' from source: unknown 18823 1726855017.02498: variable 'ansible_connection' from source: unknown 18823 1726855017.02510: variable 'ansible_module_compression' from source: unknown 18823 1726855017.02849: variable 'ansible_shell_type' from source: unknown 18823 1726855017.02852: variable 'ansible_shell_executable' from source: unknown 18823 1726855017.02856: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.02860: variable 'ansible_pipelining' from source: unknown 18823 1726855017.02863: variable 'ansible_timeout' from source: unknown 18823 1726855017.02865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.02932: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855017.02979: variable 'omit' from source: magic vars 18823 1726855017.03021: starting attempt loop 18823 1726855017.03308: running the handler 18823 1726855017.03312: _low_level_execute_command(): starting 18823 1726855017.03317: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855017.04473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.04520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.04615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.04718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.06295: stdout chunk (state=3): >>>/root <<< 18823 
1726855017.06449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.06452: stdout chunk (state=3): >>><<< 18823 1726855017.06455: stderr chunk (state=3): >>><<< 18823 1726855017.06552: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.06556: _low_level_execute_command(): starting 18823 1726855017.06559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807 `" && echo ansible-tmp-1726855017.0647202-19213-178186340855807="` echo /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807 `" ) && sleep 0' 18823 1726855017.07218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
18823 1726855017.07256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855017.07272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855017.07310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855017.07415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.07505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.07523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.07544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.07691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.09558: stdout chunk (state=3): >>>ansible-tmp-1726855017.0647202-19213-178186340855807=/root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807 <<< 18823 1726855017.09717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.09721: stdout chunk (state=3): >>><<< 18823 1726855017.09723: stderr chunk (state=3): >>><<< 18823 1726855017.09740: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855017.0647202-19213-178186340855807=/root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.10095: variable 'ansible_module_compression' from source: unknown 18823 1726855017.10099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855017.10101: variable 'ansible_facts' from source: unknown 18823 1726855017.10107: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py 18823 1726855017.10436: Sending initial data 18823 1726855017.10500: Sent initial data (156 bytes) 18823 1726855017.11347: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 18823 1726855017.11360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.11375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.11632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.11710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.11738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.13295: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18823 1726855017.13299: stderr chunk 
(state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855017.13358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855017.13440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpsnf3kmej /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py <<< 18823 1726855017.13443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py" <<< 18823 1726855017.13580: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpsnf3kmej" to remote "/root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py" <<< 18823 1726855017.14708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.14760: stderr chunk (state=3): >>><<< 18823 1726855017.14769: stdout chunk (state=3): >>><<< 18823 1726855017.14875: done transferring module to remote 18823 1726855017.14878: _low_level_execute_command(): starting 18823 1726855017.14881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/ /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py && sleep 0' 18823 1726855017.15982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.16197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.16215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.16312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.18139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.18143: stdout chunk (state=3): >>><<< 18823 1726855017.18150: stderr chunk (state=3): >>><<< 18823 1726855017.18163: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.18166: _low_level_execute_command(): starting 18823 1726855017.18171: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/AnsiballZ_command.py && sleep 0' 18823 1726855017.19438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.19442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855017.19444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855017.19447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.19449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855017.19451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855017.19453: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.19492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.19499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.19515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.19667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.35018: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 13:56:57.343951", "end": "2024-09-20 13:56:57.347617", "delta": "0:00:00.003666", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 18823 1726855017.35051: stdout chunk (state=3): >>> <<< 18823 1726855017.36710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855017.36714: stdout chunk (state=3): >>><<< 18823 1726855017.36716: stderr chunk (state=3): >>><<< 18823 1726855017.36719: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 13:56:57.343951", "end": "2024-09-20 13:56:57.347617", "delta": "0:00:00.003666", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
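The wrapped module prints its entire result as one JSON object on stdout, and the controller parses it back into a dict before rendering the task outcome. A small sketch of that round-trip using the `lsr27` result shown above (the timestamp format `%Y-%m-%d %H:%M:%S.%f` is an assumption matching the strings in the log):

```python
import json
from datetime import datetime

# Module result arrives on stdout as a single JSON document
# (abbreviated here to the fields used below).
stdout = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "set", "lsr27", "up"], '
    '"start": "2024-09-20 13:56:57.343951", '
    '"end": "2024-09-20 13:56:57.347617", '
    '"delta": "0:00:00.003666", "msg": ""}'
)
result = json.loads(stdout)

# "delta" is simply end minus start, which we can re-derive.
fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
print(result["rc"], elapsed)  # 0 0:00:00.003666
```

Re-deriving `elapsed` here reproduces the `"delta": "0:00:00.003666"` field byte-for-byte, which is why the controller can report the command's wall-clock duration without timing anything itself.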
18823 1726855017.36722: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855017.36724: _low_level_execute_command(): starting 18823 1726855017.36726: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855017.0647202-19213-178186340855807/ > /dev/null 2>&1 && sleep 0' 18823 1726855017.37852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.37861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855017.37994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855017.37998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855017.38000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855017.38003: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855017.38005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.38007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855017.38101: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.38113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.38219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.40144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.40148: stdout chunk (state=3): >>><<< 18823 1726855017.40154: stderr chunk (state=3): >>><<< 18823 1726855017.40170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.40175: handler run complete 18823 1726855017.40201: Evaluated conditional (False): False 18823 1726855017.40211: attempt loop complete, returning result 18823 1726855017.40230: variable 'item' from source: unknown 18823 1726855017.40431: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003666", "end": "2024-09-20 13:56:57.347617", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 13:56:57.343951" } 18823 1726855017.40700: dumping result to json 18823 1726855017.40706: done dumping result, returning 18823 1726855017.40709: done running TaskExecutor() for managed_node2/TASK: Create veth interface lsr27 [0affcc66-ac2b-d391-077c-000000000135] 18823 1726855017.40712: sending task result for task 0affcc66-ac2b-d391-077c-000000000135 18823 1726855017.40861: done sending task result for task 0affcc66-ac2b-d391-077c-000000000135 18823 1726855017.40864: WORKER PROCESS EXITING 18823 1726855017.40954: no more pending results, returning what we have 18823 1726855017.40961: results queue empty 18823 1726855017.40962: checking for any_errors_fatal 18823 1726855017.40970: done checking for any_errors_fatal 18823 1726855017.40971: checking for max_fail_percentage 18823 1726855017.40973: done checking for max_fail_percentage 18823 1726855017.40973: checking to see if all hosts have failed and the running result is not ok 18823 1726855017.40974: done checking to see if all hosts have failed 18823 1726855017.40975: getting the remaining hosts for this loop 18823 1726855017.40977: done getting the remaining hosts for this loop 18823 1726855017.40980: getting 
the next task for host managed_node2 18823 1726855017.40990: done getting next task for host managed_node2 18823 1726855017.40993: ^ task is: TASK: Set up veth as managed by NetworkManager 18823 1726855017.40996: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855017.41000: getting variables 18823 1726855017.41001: in VariableManager get_vars() 18823 1726855017.41030: Calling all_inventory to load vars for managed_node2 18823 1726855017.41033: Calling groups_inventory to load vars for managed_node2 18823 1726855017.41037: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855017.41049: Calling all_plugins_play to load vars for managed_node2 18823 1726855017.41052: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855017.41055: Calling groups_plugins_play to load vars for managed_node2 18823 1726855017.41461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855017.41927: done with get_vars() 18823 1726855017.41937: done getting variables 18823 1726855017.42598: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:56:57 -0400 (0:00:01.151) 0:00:09.077 ****** 18823 1726855017.42627: entering _queue_task() for managed_node2/command 18823 1726855017.43296: worker is 1 (out of 1 available) 18823 1726855017.43309: exiting _queue_task() for managed_node2/command 18823 1726855017.43319: done queuing things up, now waiting for results queue to drain 18823 1726855017.43320: waiting for pending results... 18823 1726855017.43929: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 18823 1726855017.43973: in run() - task 0affcc66-ac2b-d391-077c-000000000136 18823 1726855017.44011: variable 'ansible_search_path' from source: unknown 18823 1726855017.44140: variable 'ansible_search_path' from source: unknown 18823 1726855017.44156: calling self._execute() 18823 1726855017.44292: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.44392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.44397: variable 'omit' from source: magic vars 18823 1726855017.45112: variable 'ansible_distribution_major_version' from source: facts 18823 1726855017.45335: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855017.45562: variable 'type' from source: set_fact 18823 1726855017.45572: variable 'state' from source: include params 18823 1726855017.45582: Evaluated conditional (type == 'veth' and state == 'present'): True 18823 1726855017.45596: variable 'omit' from source: magic vars 18823 1726855017.45639: variable 'omit' from source: magic vars 18823 1726855017.45908: variable 'interface' from source: set_fact 18823 1726855017.45932: variable 'omit' from source: magic vars 18823 1726855017.46031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
18823 1726855017.46131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855017.46157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855017.46220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855017.46394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855017.46397: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855017.46400: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.46401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.46636: Set connection var ansible_timeout to 10 18823 1726855017.46643: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855017.46651: Set connection var ansible_shell_type to sh 18823 1726855017.46662: Set connection var ansible_shell_executable to /bin/sh 18823 1726855017.46673: Set connection var ansible_connection to ssh 18823 1726855017.46684: Set connection var ansible_pipelining to False 18823 1726855017.46720: variable 'ansible_shell_executable' from source: unknown 18823 1726855017.46728: variable 'ansible_connection' from source: unknown 18823 1726855017.46735: variable 'ansible_module_compression' from source: unknown 18823 1726855017.46798: variable 'ansible_shell_type' from source: unknown 18823 1726855017.46810: variable 'ansible_shell_executable' from source: unknown 18823 1726855017.46819: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.46828: variable 'ansible_pipelining' from source: unknown 18823 1726855017.46835: variable 'ansible_timeout' from source: unknown 18823 1726855017.46842: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.47140: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855017.47395: variable 'omit' from source: magic vars 18823 1726855017.47398: starting attempt loop 18823 1726855017.47401: running the handler 18823 1726855017.47406: _low_level_execute_command(): starting 18823 1726855017.47409: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855017.48912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.48933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.49109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.49137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 18823 1726855017.49227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.50852: stdout chunk (state=3): >>>/root <<< 18823 1726855017.51081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.51085: stdout chunk (state=3): >>><<< 18823 1726855017.51091: stderr chunk (state=3): >>><<< 18823 1726855017.51114: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.51134: _low_level_execute_command(): starting 18823 1726855017.51146: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461 `" && echo ansible-tmp-1726855017.5112143-19295-8969085381461="` echo 
/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461 `" ) && sleep 0' 18823 1726855017.52315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.52592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.52620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.52643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.52898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.54643: stdout chunk (state=3): >>>ansible-tmp-1726855017.5112143-19295-8969085381461=/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461 <<< 18823 1726855017.54799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.54805: stdout chunk (state=3): >>><<< 18823 1726855017.54807: stderr chunk (state=3): >>><<< 18823 1726855017.54997: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855017.5112143-19295-8969085381461=/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.55001: variable 'ansible_module_compression' from source: unknown 18823 1726855017.55006: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855017.55008: variable 'ansible_facts' from source: unknown 18823 1726855017.55230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py 18823 1726855017.55627: Sending initial data 18823 1726855017.55801: Sent initial data (154 bytes) 18823 1726855017.57193: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 18823 1726855017.57210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855017.57223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.57385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.57685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.57878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.59528: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855017.59596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855017.59676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpxp9b5_3m /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py <<< 18823 1726855017.59680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py" <<< 18823 1726855017.59741: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpxp9b5_3m" to remote "/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py" <<< 18823 1726855017.61182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.61295: stderr chunk (state=3): >>><<< 18823 1726855017.61298: stdout chunk (state=3): >>><<< 18823 1726855017.61300: done transferring module to remote 18823 1726855017.61304: _low_level_execute_command(): starting 18823 1726855017.61307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/ /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py && sleep 0' 18823 1726855017.62322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.62508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855017.62605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.62807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.62848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.62919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.64735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.64739: stdout chunk (state=3): >>><<< 18823 1726855017.64741: stderr chunk (state=3): >>><<< 18823 1726855017.64756: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.64768: _low_level_execute_command(): starting 18823 1726855017.64778: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/AnsiballZ_command.py && sleep 0' 18823 1726855017.65844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.66200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855017.66229: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.66263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.66337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.83226: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 13:56:57.810229", "end": "2024-09-20 13:56:57.827890", "delta": "0:00:00.017661", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855017.85047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855017.85051: stdout chunk (state=3): >>><<< 18823 1726855017.85054: stderr chunk (state=3): >>><<< 18823 1726855017.85057: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 13:56:57.810229", "end": "2024-09-20 13:56:57.827890", "delta": "0:00:00.017661", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855017.85060: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855017.85062: _low_level_execute_command(): starting 18823 1726855017.85064: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855017.5112143-19295-8969085381461/ > /dev/null 2>&1 && sleep 0' 18823 1726855017.86323: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855017.86568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855017.86635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855017.86656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855017.86764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855017.88660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855017.88670: stdout chunk (state=3): >>><<< 18823 1726855017.88702: stderr chunk (state=3): >>><<< 18823 1726855017.88740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855017.88756: handler run complete 18823 1726855017.88815: Evaluated conditional (False): False 18823 1726855017.88945: attempt loop complete, returning result 18823 1726855017.88949: _execute() done 18823 1726855017.88951: dumping result to json 18823 1726855017.88953: done dumping result, returning 18823 1726855017.88955: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0affcc66-ac2b-d391-077c-000000000136] 18823 1726855017.88957: sending task result for task 0affcc66-ac2b-d391-077c-000000000136 ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.017661", "end": "2024-09-20 13:56:57.827890", "rc": 0, "start": "2024-09-20 13:56:57.810229" } 18823 1726855017.89291: no more pending results, returning what we have 18823 1726855017.89295: results queue empty 18823 1726855017.89296: checking for any_errors_fatal 18823 1726855017.89309: done checking for any_errors_fatal 18823 1726855017.89310: checking for max_fail_percentage 18823 1726855017.89312: done checking for max_fail_percentage 18823 1726855017.89313: checking to see if all hosts have 
failed and the running result is not ok 18823 1726855017.89314: done checking to see if all hosts have failed 18823 1726855017.89315: getting the remaining hosts for this loop 18823 1726855017.89317: done getting the remaining hosts for this loop 18823 1726855017.89321: getting the next task for host managed_node2 18823 1726855017.89328: done getting next task for host managed_node2 18823 1726855017.89330: ^ task is: TASK: Delete veth interface {{ interface }} 18823 1726855017.89333: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855017.89338: getting variables 18823 1726855017.89340: in VariableManager get_vars() 18823 1726855017.89369: Calling all_inventory to load vars for managed_node2 18823 1726855017.89373: Calling groups_inventory to load vars for managed_node2 18823 1726855017.89376: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855017.89592: Calling all_plugins_play to load vars for managed_node2 18823 1726855017.89597: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855017.89601: Calling groups_plugins_play to load vars for managed_node2 18823 1726855017.89771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855017.90438: done with get_vars() 18823 1726855017.90449: done getting variables 18823 1726855017.90595: done sending task result for task 0affcc66-ac2b-d391-077c-000000000136 18823 1726855017.90599: WORKER PROCESS EXITING 18823 1726855017.90633: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855017.90750: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:56:57 -0400 (0:00:00.481) 0:00:09.559 ****** 18823 1726855017.90779: entering _queue_task() for managed_node2/command 18823 1726855017.91524: worker is 1 (out of 1 available) 18823 1726855017.91532: exiting _queue_task() for managed_node2/command 18823 1726855017.91541: done queuing things up, now waiting for results queue to drain 18823 1726855017.91542: waiting for pending results... 
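[Editor's note] The run of conditional create/delete tasks that follows (veth, dummy, tap) all skip on evaluated `when` expressions. A minimal sketch of what one such task in manage_test_interface.yml likely looks like; the task name and the `when` expression are taken verbatim from the log's `false_condition` output, while the `command` line is an assumption added for illustration:

```yaml
# Hypothetical reconstruction of one task from manage_test_interface.yml.
# The `when` expression is the "false_condition" reported in the log;
# the actual command used to delete the interface is an assumption.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }}
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
```

In this run the conditions evaluate False (the test interface is being set up, not torn down), so each task is reported as skipping with `"skip_reason": "Conditional result was False"`.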
18823 1726855017.91559: running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr27 18823 1726855017.91899: in run() - task 0affcc66-ac2b-d391-077c-000000000137 18823 1726855017.91922: variable 'ansible_search_path' from source: unknown 18823 1726855017.91931: variable 'ansible_search_path' from source: unknown 18823 1726855017.91972: calling self._execute() 18823 1726855017.92057: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.92292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.92316: variable 'omit' from source: magic vars 18823 1726855017.93016: variable 'ansible_distribution_major_version' from source: facts 18823 1726855017.93033: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855017.93619: variable 'type' from source: set_fact 18823 1726855017.93627: variable 'state' from source: include params 18823 1726855017.93630: variable 'interface' from source: set_fact 18823 1726855017.93633: variable 'current_interfaces' from source: set_fact 18823 1726855017.93636: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 18823 1726855017.93638: when evaluation is False, skipping this task 18823 1726855017.93640: _execute() done 18823 1726855017.93643: dumping result to json 18823 1726855017.93645: done dumping result, returning 18823 1726855017.93647: done running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr27 [0affcc66-ac2b-d391-077c-000000000137] 18823 1726855017.93649: sending task result for task 0affcc66-ac2b-d391-077c-000000000137 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18823 1726855017.93806: no more pending results, returning what we have 18823 1726855017.93811: results queue empty 18823 1726855017.93812: 
checking for any_errors_fatal 18823 1726855017.93820: done checking for any_errors_fatal 18823 1726855017.93821: checking for max_fail_percentage 18823 1726855017.93823: done checking for max_fail_percentage 18823 1726855017.93823: checking to see if all hosts have failed and the running result is not ok 18823 1726855017.93824: done checking to see if all hosts have failed 18823 1726855017.93825: getting the remaining hosts for this loop 18823 1726855017.93826: done getting the remaining hosts for this loop 18823 1726855017.93830: getting the next task for host managed_node2 18823 1726855017.93837: done getting next task for host managed_node2 18823 1726855017.93839: ^ task is: TASK: Create dummy interface {{ interface }} 18823 1726855017.93843: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855017.93847: getting variables 18823 1726855017.93849: in VariableManager get_vars() 18823 1726855017.93877: Calling all_inventory to load vars for managed_node2 18823 1726855017.93881: Calling groups_inventory to load vars for managed_node2 18823 1726855017.93884: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855017.93901: Calling all_plugins_play to load vars for managed_node2 18823 1726855017.93904: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855017.93907: Calling groups_plugins_play to load vars for managed_node2 18823 1726855017.93917: done sending task result for task 0affcc66-ac2b-d391-077c-000000000137 18823 1726855017.93921: WORKER PROCESS EXITING 18823 1726855017.94557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855017.95405: done with get_vars() 18823 1726855017.95414: done getting variables 18823 1726855017.95466: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855017.95569: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:56:57 -0400 (0:00:00.052) 0:00:09.611 ****** 18823 1726855017.96001: entering _queue_task() for managed_node2/command 18823 1726855017.96643: worker is 1 (out of 1 available) 18823 1726855017.96655: exiting _queue_task() for managed_node2/command 18823 1726855017.96666: done queuing things up, now waiting for results queue to drain 18823 1726855017.96667: waiting for pending results... 
18823 1726855017.97405: running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr27 18823 1726855017.97410: in run() - task 0affcc66-ac2b-d391-077c-000000000138 18823 1726855017.97413: variable 'ansible_search_path' from source: unknown 18823 1726855017.97416: variable 'ansible_search_path' from source: unknown 18823 1726855017.97419: calling self._execute() 18823 1726855017.97508: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855017.97520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855017.97537: variable 'omit' from source: magic vars 18823 1726855017.97915: variable 'ansible_distribution_major_version' from source: facts 18823 1726855017.97933: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855017.98144: variable 'type' from source: set_fact 18823 1726855017.98155: variable 'state' from source: include params 18823 1726855017.98165: variable 'interface' from source: set_fact 18823 1726855017.98173: variable 'current_interfaces' from source: set_fact 18823 1726855017.98185: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 18823 1726855017.98194: when evaluation is False, skipping this task 18823 1726855017.98205: _execute() done 18823 1726855017.98214: dumping result to json 18823 1726855017.98222: done dumping result, returning 18823 1726855017.98238: done running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr27 [0affcc66-ac2b-d391-077c-000000000138] 18823 1726855017.98250: sending task result for task 0affcc66-ac2b-d391-077c-000000000138 18823 1726855017.98395: done sending task result for task 0affcc66-ac2b-d391-077c-000000000138 18823 1726855017.98399: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result 
was False" } 18823 1726855017.98446: no more pending results, returning what we have 18823 1726855017.98450: results queue empty 18823 1726855017.98451: checking for any_errors_fatal 18823 1726855017.98456: done checking for any_errors_fatal 18823 1726855017.98456: checking for max_fail_percentage 18823 1726855017.98458: done checking for max_fail_percentage 18823 1726855017.98459: checking to see if all hosts have failed and the running result is not ok 18823 1726855017.98459: done checking to see if all hosts have failed 18823 1726855017.98460: getting the remaining hosts for this loop 18823 1726855017.98462: done getting the remaining hosts for this loop 18823 1726855017.98465: getting the next task for host managed_node2 18823 1726855017.98473: done getting next task for host managed_node2 18823 1726855017.98476: ^ task is: TASK: Delete dummy interface {{ interface }} 18823 1726855017.98479: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855017.98483: getting variables 18823 1726855017.98484: in VariableManager get_vars() 18823 1726855017.98514: Calling all_inventory to load vars for managed_node2 18823 1726855017.98517: Calling groups_inventory to load vars for managed_node2 18823 1726855017.98520: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855017.98532: Calling all_plugins_play to load vars for managed_node2 18823 1726855017.98534: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855017.98537: Calling groups_plugins_play to load vars for managed_node2 18823 1726855017.98758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855017.99649: done with get_vars() 18823 1726855017.99659: done getting variables 18823 1726855017.99716: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855018.00227: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:56:58 -0400 (0:00:00.042) 0:00:09.654 ****** 18823 1726855018.00255: entering _queue_task() for managed_node2/command 18823 1726855018.00928: worker is 1 (out of 1 available) 18823 1726855018.00942: exiting _queue_task() for managed_node2/command 18823 1726855018.00955: done queuing things up, now waiting for results queue to drain 18823 1726855018.00956: waiting for pending results... 
18823 1726855018.01210: running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr27 18823 1726855018.01496: in run() - task 0affcc66-ac2b-d391-077c-000000000139 18823 1726855018.01536: variable 'ansible_search_path' from source: unknown 18823 1726855018.01792: variable 'ansible_search_path' from source: unknown 18823 1726855018.01797: calling self._execute() 18823 1726855018.01829: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.01841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.01856: variable 'omit' from source: magic vars 18823 1726855018.02443: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.02773: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.02995: variable 'type' from source: set_fact 18823 1726855018.03116: variable 'state' from source: include params 18823 1726855018.03124: variable 'interface' from source: set_fact 18823 1726855018.03131: variable 'current_interfaces' from source: set_fact 18823 1726855018.03144: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 18823 1726855018.03150: when evaluation is False, skipping this task 18823 1726855018.03156: _execute() done 18823 1726855018.03185: dumping result to json 18823 1726855018.03194: done dumping result, returning 18823 1726855018.03208: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr27 [0affcc66-ac2b-d391-077c-000000000139] 18823 1726855018.03300: sending task result for task 0affcc66-ac2b-d391-077c-000000000139 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18823 1726855018.03450: no more pending results, returning what we have 18823 1726855018.03455: results queue empty 18823 1726855018.03456: 
checking for any_errors_fatal 18823 1726855018.03462: done checking for any_errors_fatal 18823 1726855018.03463: checking for max_fail_percentage 18823 1726855018.03464: done checking for max_fail_percentage 18823 1726855018.03464: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.03465: done checking to see if all hosts have failed 18823 1726855018.03466: getting the remaining hosts for this loop 18823 1726855018.03467: done getting the remaining hosts for this loop 18823 1726855018.03471: getting the next task for host managed_node2 18823 1726855018.03477: done getting next task for host managed_node2 18823 1726855018.03480: ^ task is: TASK: Create tap interface {{ interface }} 18823 1726855018.03483: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.03489: getting variables 18823 1726855018.03492: in VariableManager get_vars() 18823 1726855018.03524: Calling all_inventory to load vars for managed_node2 18823 1726855018.03527: Calling groups_inventory to load vars for managed_node2 18823 1726855018.03531: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.03546: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.03550: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.03553: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.04513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.05105: done with get_vars() 18823 1726855018.05115: done getting variables 18823 1726855018.05148: done sending task result for task 0affcc66-ac2b-d391-077c-000000000139 18823 1726855018.05151: WORKER PROCESS EXITING 18823 1726855018.05183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855018.05598: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:56:58 -0400 (0:00:00.053) 0:00:09.708 ****** 18823 1726855018.05634: entering _queue_task() for managed_node2/command 18823 1726855018.06039: worker is 1 (out of 1 available) 18823 1726855018.06052: exiting _queue_task() for managed_node2/command 18823 1726855018.06064: done queuing things up, now waiting for results queue to drain 18823 1726855018.06065: waiting for pending results... 
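[Editor's note] Each task banner renders `{{ interface }}` to `lsr27` — note the `variable 'interface' from source: set_fact` line immediately preceding every `TASK [... lsr27]` header. A hedged sketch of how that variable is likely registered; the exact task that sets it is not shown in this excerpt, so the following is illustrative only:

```yaml
# Illustrative only: `interface` reaches the templated task names via
# set_fact, per the log's "variable 'interface' from source: set_fact".
# The name and placement of the registering task are assumptions.
- name: Set test interface name
  set_fact:
    interface: lsr27
```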
18823 1726855018.06533: running TaskExecutor() for managed_node2/TASK: Create tap interface lsr27 18823 1726855018.06638: in run() - task 0affcc66-ac2b-d391-077c-00000000013a 18823 1726855018.06659: variable 'ansible_search_path' from source: unknown 18823 1726855018.06668: variable 'ansible_search_path' from source: unknown 18823 1726855018.06717: calling self._execute() 18823 1726855018.06797: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.06814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.06831: variable 'omit' from source: magic vars 18823 1726855018.07206: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.07225: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.07474: variable 'type' from source: set_fact 18823 1726855018.07484: variable 'state' from source: include params 18823 1726855018.07495: variable 'interface' from source: set_fact 18823 1726855018.07507: variable 'current_interfaces' from source: set_fact 18823 1726855018.07649: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 18823 1726855018.07698: when evaluation is False, skipping this task 18823 1726855018.07708: _execute() done 18823 1726855018.07725: dumping result to json 18823 1726855018.07733: done dumping result, returning 18823 1726855018.07743: done running TaskExecutor() for managed_node2/TASK: Create tap interface lsr27 [0affcc66-ac2b-d391-077c-00000000013a] 18823 1726855018.07765: sending task result for task 0affcc66-ac2b-d391-077c-00000000013a skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 18823 1726855018.07935: no more pending results, returning what we have 18823 1726855018.07939: results queue empty 18823 
1726855018.07940: checking for any_errors_fatal 18823 1726855018.07945: done checking for any_errors_fatal 18823 1726855018.07946: checking for max_fail_percentage 18823 1726855018.07948: done checking for max_fail_percentage 18823 1726855018.07948: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.07949: done checking to see if all hosts have failed 18823 1726855018.07949: getting the remaining hosts for this loop 18823 1726855018.07951: done getting the remaining hosts for this loop 18823 1726855018.07954: getting the next task for host managed_node2 18823 1726855018.07960: done getting next task for host managed_node2 18823 1726855018.07962: ^ task is: TASK: Delete tap interface {{ interface }} 18823 1726855018.07965: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.07970: getting variables 18823 1726855018.07971: in VariableManager get_vars() 18823 1726855018.08001: Calling all_inventory to load vars for managed_node2 18823 1726855018.08004: Calling groups_inventory to load vars for managed_node2 18823 1726855018.08008: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.08023: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.08027: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.08030: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.08413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.09200: done sending task result for task 0affcc66-ac2b-d391-077c-00000000013a 18823 1726855018.09204: WORKER PROCESS EXITING 18823 1726855018.09241: done with get_vars() 18823 1726855018.09252: done getting variables 18823 1726855018.09311: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855018.09523: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:56:58 -0400 (0:00:00.039) 0:00:09.747 ****** 18823 1726855018.09552: entering _queue_task() for managed_node2/command 18823 1726855018.10035: worker is 1 (out of 1 available) 18823 1726855018.10047: exiting _queue_task() for managed_node2/command 18823 1726855018.10058: done queuing things up, now waiting for results queue to drain 18823 1726855018.10059: waiting for pending results... 
18823 1726855018.10520: running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr27 18823 1726855018.10640: in run() - task 0affcc66-ac2b-d391-077c-00000000013b 18823 1726855018.10661: variable 'ansible_search_path' from source: unknown 18823 1726855018.10674: variable 'ansible_search_path' from source: unknown 18823 1726855018.10720: calling self._execute() 18823 1726855018.10812: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.10824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.10840: variable 'omit' from source: magic vars 18823 1726855018.11526: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.11566: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.12006: variable 'type' from source: set_fact 18823 1726855018.12105: variable 'state' from source: include params 18823 1726855018.12117: variable 'interface' from source: set_fact 18823 1726855018.12126: variable 'current_interfaces' from source: set_fact 18823 1726855018.12139: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 18823 1726855018.12167: when evaluation is False, skipping this task 18823 1726855018.12381: _execute() done 18823 1726855018.12385: dumping result to json 18823 1726855018.12389: done dumping result, returning 18823 1726855018.12392: done running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr27 [0affcc66-ac2b-d391-077c-00000000013b] 18823 1726855018.12395: sending task result for task 0affcc66-ac2b-d391-077c-00000000013b 18823 1726855018.12464: done sending task result for task 0affcc66-ac2b-d391-077c-00000000013b 18823 1726855018.12469: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18823 
1726855018.12526: no more pending results, returning what we have 18823 1726855018.12531: results queue empty 18823 1726855018.12532: checking for any_errors_fatal 18823 1726855018.12538: done checking for any_errors_fatal 18823 1726855018.12539: checking for max_fail_percentage 18823 1726855018.12541: done checking for max_fail_percentage 18823 1726855018.12541: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.12542: done checking to see if all hosts have failed 18823 1726855018.12543: getting the remaining hosts for this loop 18823 1726855018.12544: done getting the remaining hosts for this loop 18823 1726855018.12549: getting the next task for host managed_node2 18823 1726855018.12559: done getting next task for host managed_node2 18823 1726855018.12564: ^ task is: TASK: Include the task 'assert_device_present.yml' 18823 1726855018.12567: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.12571: getting variables 18823 1726855018.12572: in VariableManager get_vars() 18823 1726855018.12606: Calling all_inventory to load vars for managed_node2 18823 1726855018.12609: Calling groups_inventory to load vars for managed_node2 18823 1726855018.12612: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.12625: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.12628: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.12630: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.13053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.13670: done with get_vars() 18823 1726855018.13681: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Friday 20 September 2024 13:56:58 -0400 (0:00:00.046) 0:00:09.793 ****** 18823 1726855018.14179: entering _queue_task() for managed_node2/include_tasks 18823 1726855018.15336: worker is 1 (out of 1 available) 18823 1726855018.15345: exiting _queue_task() for managed_node2/include_tasks 18823 1726855018.15355: done queuing things up, now waiting for results queue to drain 18823 1726855018.15356: waiting for pending results... 
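[Editor's note] The next task is an include, after which the log shows the dynamic include machinery at work ("we have included files to process", "generating all_blocks data", "extending task lists for all hosts with included blocks"). A minimal sketch of the include at tests_ethernet.yml:30; the included file path is confirmed by the log, but whether the playbook uses `include_tasks` or another include form is an assumption (the runtime block-extension behavior seen here is consistent with a dynamic `include_tasks`):

```yaml
# Sketch of the include task at playbooks/tests_ethernet.yml:30.
# The target path appears verbatim in the log; the include_tasks
# directive itself is an assumption.
- name: Include the task 'assert_device_present.yml'
  include_tasks: tasks/assert_device_present.yml
```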
18823 1726855018.15508: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 18823 1726855018.15648: in run() - task 0affcc66-ac2b-d391-077c-000000000012 18823 1726855018.15704: variable 'ansible_search_path' from source: unknown 18823 1726855018.15756: calling self._execute() 18823 1726855018.15949: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.15961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.16097: variable 'omit' from source: magic vars 18823 1726855018.16824: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.16843: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.16854: _execute() done 18823 1726855018.16870: dumping result to json 18823 1726855018.16900: done dumping result, returning 18823 1726855018.16918: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0affcc66-ac2b-d391-077c-000000000012] 18823 1726855018.16928: sending task result for task 0affcc66-ac2b-d391-077c-000000000012 18823 1726855018.17311: done sending task result for task 0affcc66-ac2b-d391-077c-000000000012 18823 1726855018.17315: WORKER PROCESS EXITING 18823 1726855018.17346: no more pending results, returning what we have 18823 1726855018.17352: in VariableManager get_vars() 18823 1726855018.17389: Calling all_inventory to load vars for managed_node2 18823 1726855018.17392: Calling groups_inventory to load vars for managed_node2 18823 1726855018.17395: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.17410: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.17414: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.17417: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.17791: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.18183: done with get_vars() 18823 1726855018.18193: variable 'ansible_search_path' from source: unknown 18823 1726855018.18207: we have included files to process 18823 1726855018.18208: generating all_blocks data 18823 1726855018.18210: done generating all_blocks data 18823 1726855018.18214: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18823 1726855018.18216: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18823 1726855018.18218: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18823 1726855018.18578: in VariableManager get_vars() 18823 1726855018.18596: done with get_vars() 18823 1726855018.18773: done processing included file 18823 1726855018.18776: iterating over new_blocks loaded from include file 18823 1726855018.18777: in VariableManager get_vars() 18823 1726855018.18790: done with get_vars() 18823 1726855018.18791: filtering new block on tags 18823 1726855018.18809: done filtering new block on tags 18823 1726855018.18811: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 18823 1726855018.18816: extending task lists for all hosts with included blocks 18823 1726855018.20347: done extending task lists 18823 1726855018.20464: done processing included files 18823 1726855018.20466: results queue empty 18823 1726855018.20466: checking for any_errors_fatal 18823 1726855018.20470: done checking for any_errors_fatal 18823 1726855018.20471: checking for max_fail_percentage 18823 1726855018.20472: done 
checking for max_fail_percentage 18823 1726855018.20473: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.20473: done checking to see if all hosts have failed 18823 1726855018.20474: getting the remaining hosts for this loop 18823 1726855018.20475: done getting the remaining hosts for this loop 18823 1726855018.20478: getting the next task for host managed_node2 18823 1726855018.20482: done getting next task for host managed_node2 18823 1726855018.20484: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18823 1726855018.20489: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.20491: getting variables 18823 1726855018.20492: in VariableManager get_vars() 18823 1726855018.20502: Calling all_inventory to load vars for managed_node2 18823 1726855018.20507: Calling groups_inventory to load vars for managed_node2 18823 1726855018.20509: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.20514: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.20516: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.20519: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.20779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.21179: done with get_vars() 18823 1726855018.21340: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:56:58 -0400 (0:00:00.072) 0:00:09.865 ****** 18823 1726855018.21418: entering _queue_task() for managed_node2/include_tasks 18823 1726855018.22428: worker is 1 (out of 1 available) 18823 1726855018.22438: exiting _queue_task() for managed_node2/include_tasks 18823 1726855018.22448: done queuing things up, now waiting for results queue to drain 18823 1726855018.22449: waiting for pending results... 
18823 1726855018.22693: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 18823 1726855018.22861: in run() - task 0affcc66-ac2b-d391-077c-0000000001d3 18823 1726855018.22913: variable 'ansible_search_path' from source: unknown 18823 1726855018.23114: variable 'ansible_search_path' from source: unknown 18823 1726855018.23117: calling self._execute() 18823 1726855018.23226: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.23239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.23256: variable 'omit' from source: magic vars 18823 1726855018.24061: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.24297: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.24300: _execute() done 18823 1726855018.24306: dumping result to json 18823 1726855018.24309: done dumping result, returning 18823 1726855018.24311: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-d391-077c-0000000001d3] 18823 1726855018.24314: sending task result for task 0affcc66-ac2b-d391-077c-0000000001d3 18823 1726855018.24386: done sending task result for task 0affcc66-ac2b-d391-077c-0000000001d3 18823 1726855018.24425: no more pending results, returning what we have 18823 1726855018.24431: in VariableManager get_vars() 18823 1726855018.24470: Calling all_inventory to load vars for managed_node2 18823 1726855018.24474: Calling groups_inventory to load vars for managed_node2 18823 1726855018.24478: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.24499: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.24505: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.24509: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.25147: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.25630: done with get_vars() 18823 1726855018.25639: variable 'ansible_search_path' from source: unknown 18823 1726855018.25640: variable 'ansible_search_path' from source: unknown 18823 1726855018.25774: WORKER PROCESS EXITING 18823 1726855018.25820: we have included files to process 18823 1726855018.25821: generating all_blocks data 18823 1726855018.25823: done generating all_blocks data 18823 1726855018.25824: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18823 1726855018.25826: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18823 1726855018.25828: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18823 1726855018.26293: done processing included file 18823 1726855018.26295: iterating over new_blocks loaded from include file 18823 1726855018.26297: in VariableManager get_vars() 18823 1726855018.26311: done with get_vars() 18823 1726855018.26426: filtering new block on tags 18823 1726855018.26442: done filtering new block on tags 18823 1726855018.26445: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 18823 1726855018.26449: extending task lists for all hosts with included blocks 18823 1726855018.26764: done extending task lists 18823 1726855018.26765: done processing included files 18823 1726855018.26766: results queue empty 18823 1726855018.26766: checking for any_errors_fatal 18823 1726855018.26770: done checking for any_errors_fatal 18823 1726855018.26771: checking for max_fail_percentage 18823 
1726855018.26772: done checking for max_fail_percentage 18823 1726855018.26772: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.26773: done checking to see if all hosts have failed 18823 1726855018.26774: getting the remaining hosts for this loop 18823 1726855018.26775: done getting the remaining hosts for this loop 18823 1726855018.26777: getting the next task for host managed_node2 18823 1726855018.26781: done getting next task for host managed_node2 18823 1726855018.26784: ^ task is: TASK: Get stat for interface {{ interface }} 18823 1726855018.26786: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.26790: getting variables 18823 1726855018.26791: in VariableManager get_vars() 18823 1726855018.26799: Calling all_inventory to load vars for managed_node2 18823 1726855018.26801: Calling groups_inventory to load vars for managed_node2 18823 1726855018.26806: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.26811: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.26813: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.26815: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.27073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.27476: done with get_vars() 18823 1726855018.27485: done getting variables 18823 1726855018.27782: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:56:58 -0400 (0:00:00.063) 0:00:09.929 ****** 18823 1726855018.27816: entering _queue_task() for managed_node2/stat 18823 1726855018.28559: worker is 1 (out of 1 available) 18823 1726855018.28572: exiting _queue_task() for managed_node2/stat 18823 1726855018.28589: done queuing things up, now waiting for results queue to drain 18823 1726855018.28591: waiting for pending results... 
18823 1726855018.29122: running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 18823 1726855018.29537: in run() - task 0affcc66-ac2b-d391-077c-00000000021e 18823 1726855018.29643: variable 'ansible_search_path' from source: unknown 18823 1726855018.29646: variable 'ansible_search_path' from source: unknown 18823 1726855018.29649: calling self._execute() 18823 1726855018.29676: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.29759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.29775: variable 'omit' from source: magic vars 18823 1726855018.30572: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.30660: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.30673: variable 'omit' from source: magic vars 18823 1726855018.30741: variable 'omit' from source: magic vars 18823 1726855018.30980: variable 'interface' from source: set_fact 18823 1726855018.31113: variable 'omit' from source: magic vars 18823 1726855018.31166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855018.31321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855018.31351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855018.31374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855018.31393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855018.31434: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855018.31594: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.31597: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.31877: Set connection var ansible_timeout to 10 18823 1726855018.31880: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855018.31882: Set connection var ansible_shell_type to sh 18823 1726855018.31884: Set connection var ansible_shell_executable to /bin/sh 18823 1726855018.31888: Set connection var ansible_connection to ssh 18823 1726855018.31892: Set connection var ansible_pipelining to False 18823 1726855018.31894: variable 'ansible_shell_executable' from source: unknown 18823 1726855018.31897: variable 'ansible_connection' from source: unknown 18823 1726855018.31899: variable 'ansible_module_compression' from source: unknown 18823 1726855018.31901: variable 'ansible_shell_type' from source: unknown 18823 1726855018.31906: variable 'ansible_shell_executable' from source: unknown 18823 1726855018.31908: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.31910: variable 'ansible_pipelining' from source: unknown 18823 1726855018.31912: variable 'ansible_timeout' from source: unknown 18823 1726855018.31914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.32283: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855018.32492: variable 'omit' from source: magic vars 18823 1726855018.32495: starting attempt loop 18823 1726855018.32499: running the handler 18823 1726855018.32506: _low_level_execute_command(): starting 18823 1726855018.32509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855018.34227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.34595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.36170: stdout chunk (state=3): >>>/root <<< 18823 1726855018.36324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.36368: stderr chunk (state=3): >>><<< 18823 1726855018.36377: stdout chunk (state=3): >>><<< 18823 1726855018.36513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.36630: _low_level_execute_command(): starting 18823 1726855018.36635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554 `" && echo ansible-tmp-1726855018.3652287-19326-192935643735554="` echo /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554 `" ) && sleep 0' 18823 1726855018.37869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855018.37884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855018.38008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.38060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855018.38079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.38082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.38229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.40313: stdout chunk (state=3): >>>ansible-tmp-1726855018.3652287-19326-192935643735554=/root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554 <<< 18823 1726855018.40339: stdout chunk (state=3): >>><<< 18823 1726855018.40342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.40344: stderr chunk (state=3): >>><<< 18823 1726855018.40360: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855018.3652287-19326-192935643735554=/root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.40417: variable 'ansible_module_compression' from source: unknown 18823 1726855018.40655: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18823 1726855018.40700: variable 'ansible_facts' from source: unknown 18823 1726855018.41095: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py 18823 1726855018.41221: Sending initial data 18823 1726855018.41225: Sent initial data (153 bytes) 18823 1726855018.42625: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855018.42631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855018.42649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855018.42654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.42684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.42794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.42948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.43107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.44693: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18823 1726855018.44712: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18823 1726855018.44727: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855018.44818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855018.44935: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp_2hs11cj /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py <<< 18823 1726855018.44939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py" <<< 18823 1726855018.44950: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp_2hs11cj" to remote "/root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py" <<< 18823 1726855018.46299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.46318: stderr chunk (state=3): >>><<< 18823 1726855018.46326: stdout chunk (state=3): >>><<< 18823 1726855018.46386: done transferring module to remote 18823 1726855018.46569: _low_level_execute_command(): starting 18823 1726855018.46573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/ /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py && sleep 0' 18823 1726855018.47653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855018.47663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855018.47710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855018.47723: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.47901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855018.47993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.48013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.49812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.50042: stderr chunk (state=3): >>><<< 18823 1726855018.50046: stdout chunk (state=3): >>><<< 18823 1726855018.50048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.50051: _low_level_execute_command(): starting 18823 1726855018.50053: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/AnsiballZ_stat.py && sleep 0' 18823 1726855018.51258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855018.51276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855018.51295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855018.51402: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.51577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.51711: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.51785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.66898: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29408, "dev": 23, "nlink": 1, "atime": 1726855016.592265, "mtime": 1726855016.592265, "ctime": 1726855016.592265, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18823 1726855018.68529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855018.68534: stdout chunk (state=3): >>><<< 18823 1726855018.68536: stderr chunk (state=3): >>><<< 18823 1726855018.68538: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29408, "dev": 23, "nlink": 1, "atime": 1726855016.592265, "mtime": 1726855016.592265, "ctime": 1726855016.592265, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855018.68548: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855018.68565: _low_level_execute_command(): starting 18823 1726855018.68574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855018.3652287-19326-192935643735554/ > /dev/null 2>&1 && sleep 0' 18823 1726855018.70013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855018.70028: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855018.70039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
18823 1726855018.70054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855018.70061: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855018.70067: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855018.70075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855018.70160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.70292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855018.70315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.70318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.70502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.72495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.72499: stdout chunk (state=3): >>><<< 18823 1726855018.72502: stderr chunk (state=3): >>><<< 18823 1726855018.72505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.72507: handler run complete 18823 1726855018.72694: attempt loop complete, returning result 18823 1726855018.72698: _execute() done 18823 1726855018.72700: dumping result to json 18823 1726855018.72703: done dumping result, returning 18823 1726855018.72706: done running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 [0affcc66-ac2b-d391-077c-00000000021e] 18823 1726855018.72708: sending task result for task 0affcc66-ac2b-d391-077c-00000000021e 18823 1726855018.72894: done sending task result for task 0affcc66-ac2b-d391-077c-00000000021e ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726855016.592265, "block_size": 4096, "blocks": 0, "ctime": 1726855016.592265, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29408, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726855016.592265, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": 
true, "xgrp": true, "xoth": true, "xusr": true } } 18823 1726855018.73357: no more pending results, returning what we have 18823 1726855018.73360: results queue empty 18823 1726855018.73361: checking for any_errors_fatal 18823 1726855018.73362: done checking for any_errors_fatal 18823 1726855018.73363: checking for max_fail_percentage 18823 1726855018.73364: done checking for max_fail_percentage 18823 1726855018.73365: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.73365: done checking to see if all hosts have failed 18823 1726855018.73366: getting the remaining hosts for this loop 18823 1726855018.73367: done getting the remaining hosts for this loop 18823 1726855018.73370: getting the next task for host managed_node2 18823 1726855018.73376: done getting next task for host managed_node2 18823 1726855018.73378: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 18823 1726855018.73381: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.73384: getting variables 18823 1726855018.73385: in VariableManager get_vars() 18823 1726855018.73414: Calling all_inventory to load vars for managed_node2 18823 1726855018.73417: Calling groups_inventory to load vars for managed_node2 18823 1726855018.73419: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.73429: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.73431: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.73434: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.73757: WORKER PROCESS EXITING 18823 1726855018.73789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.74233: done with get_vars() 18823 1726855018.74244: done getting variables 18823 1726855018.74365: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18823 1726855018.74490: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:56:58 -0400 (0:00:00.469) 0:00:10.398 ****** 18823 1726855018.74723: entering _queue_task() for managed_node2/assert 18823 1726855018.74725: Creating lock for assert 18823 1726855018.75493: worker is 1 (out of 1 available) 18823 1726855018.75508: exiting _queue_task() for managed_node2/assert 18823 1726855018.75521: done queuing things up, now waiting for results queue to drain 18823 1726855018.75522: waiting for pending results... 
18823 1726855018.75858: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr27' 18823 1726855018.75954: in run() - task 0affcc66-ac2b-d391-077c-0000000001d4 18823 1726855018.75966: variable 'ansible_search_path' from source: unknown 18823 1726855018.75969: variable 'ansible_search_path' from source: unknown 18823 1726855018.76008: calling self._execute() 18823 1726855018.76085: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.76090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.76102: variable 'omit' from source: magic vars 18823 1726855018.76476: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.76489: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.76496: variable 'omit' from source: magic vars 18823 1726855018.76538: variable 'omit' from source: magic vars 18823 1726855018.76636: variable 'interface' from source: set_fact 18823 1726855018.76652: variable 'omit' from source: magic vars 18823 1726855018.76699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855018.76737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855018.76756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855018.76778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855018.76797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855018.76822: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855018.76826: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.76828: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.76953: Set connection var ansible_timeout to 10 18823 1726855018.76969: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855018.76973: Set connection var ansible_shell_type to sh 18823 1726855018.76976: Set connection var ansible_shell_executable to /bin/sh 18823 1726855018.76978: Set connection var ansible_connection to ssh 18823 1726855018.76981: Set connection var ansible_pipelining to False 18823 1726855018.77152: variable 'ansible_shell_executable' from source: unknown 18823 1726855018.77156: variable 'ansible_connection' from source: unknown 18823 1726855018.77159: variable 'ansible_module_compression' from source: unknown 18823 1726855018.77161: variable 'ansible_shell_type' from source: unknown 18823 1726855018.77163: variable 'ansible_shell_executable' from source: unknown 18823 1726855018.77165: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.77168: variable 'ansible_pipelining' from source: unknown 18823 1726855018.77170: variable 'ansible_timeout' from source: unknown 18823 1726855018.77191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.77420: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855018.77494: variable 'omit' from source: magic vars 18823 1726855018.77497: starting attempt loop 18823 1726855018.77500: running the handler 18823 1726855018.77633: variable 'interface_stat' from source: set_fact 18823 1726855018.77663: Evaluated conditional (interface_stat.stat.exists): True 18823 1726855018.77672: handler run complete 18823 1726855018.77692: attempt loop complete, returning result 18823 
1726855018.77699: _execute() done 18823 1726855018.77705: dumping result to json 18823 1726855018.77712: done dumping result, returning 18823 1726855018.77722: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr27' [0affcc66-ac2b-d391-077c-0000000001d4] 18823 1726855018.77764: sending task result for task 0affcc66-ac2b-d391-077c-0000000001d4 18823 1726855018.78001: done sending task result for task 0affcc66-ac2b-d391-077c-0000000001d4 18823 1726855018.78003: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18823 1726855018.78053: no more pending results, returning what we have 18823 1726855018.78081: results queue empty 18823 1726855018.78083: checking for any_errors_fatal 18823 1726855018.78096: done checking for any_errors_fatal 18823 1726855018.78097: checking for max_fail_percentage 18823 1726855018.78099: done checking for max_fail_percentage 18823 1726855018.78100: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.78101: done checking to see if all hosts have failed 18823 1726855018.78101: getting the remaining hosts for this loop 18823 1726855018.78103: done getting the remaining hosts for this loop 18823 1726855018.78107: getting the next task for host managed_node2 18823 1726855018.78116: done getting next task for host managed_node2 18823 1726855018.78118: ^ task is: TASK: meta (flush_handlers) 18823 1726855018.78120: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.78125: getting variables 18823 1726855018.78126: in VariableManager get_vars() 18823 1726855018.78155: Calling all_inventory to load vars for managed_node2 18823 1726855018.78158: Calling groups_inventory to load vars for managed_node2 18823 1726855018.78161: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.78289: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.78294: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.78297: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.78612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.78907: done with get_vars() 18823 1726855018.78922: done getting variables 18823 1726855018.78999: in VariableManager get_vars() 18823 1726855018.79008: Calling all_inventory to load vars for managed_node2 18823 1726855018.79009: Calling groups_inventory to load vars for managed_node2 18823 1726855018.79011: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.79014: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.79016: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.79017: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.79102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.79210: done with get_vars() 18823 1726855018.79219: done queuing things up, now waiting for results queue to drain 18823 1726855018.79220: results queue empty 18823 1726855018.79221: checking for any_errors_fatal 18823 1726855018.79222: done checking for any_errors_fatal 18823 1726855018.79223: checking for max_fail_percentage 18823 1726855018.79223: done checking for max_fail_percentage 18823 1726855018.79224: checking to see if all hosts have failed and the running result is not 
ok 18823 1726855018.79224: done checking to see if all hosts have failed 18823 1726855018.79229: getting the remaining hosts for this loop 18823 1726855018.79229: done getting the remaining hosts for this loop 18823 1726855018.79231: getting the next task for host managed_node2 18823 1726855018.79233: done getting next task for host managed_node2 18823 1726855018.79234: ^ task is: TASK: meta (flush_handlers) 18823 1726855018.79235: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855018.79237: getting variables 18823 1726855018.79237: in VariableManager get_vars() 18823 1726855018.79242: Calling all_inventory to load vars for managed_node2 18823 1726855018.79243: Calling groups_inventory to load vars for managed_node2 18823 1726855018.79244: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.79247: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.79248: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.79250: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.79380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.79580: done with get_vars() 18823 1726855018.79590: done getting variables 18823 1726855018.79637: in VariableManager get_vars() 18823 1726855018.79645: Calling all_inventory to load vars for managed_node2 18823 1726855018.79647: Calling groups_inventory to load vars for managed_node2 18823 1726855018.79649: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.79653: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.79655: Calling groups_plugins_inventory to load vars for 
managed_node2 18823 1726855018.79657: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.79786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.79970: done with get_vars() 18823 1726855018.79980: done queuing things up, now waiting for results queue to drain 18823 1726855018.79983: results queue empty 18823 1726855018.79983: checking for any_errors_fatal 18823 1726855018.79985: done checking for any_errors_fatal 18823 1726855018.79985: checking for max_fail_percentage 18823 1726855018.79986: done checking for max_fail_percentage 18823 1726855018.79989: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.79990: done checking to see if all hosts have failed 18823 1726855018.79990: getting the remaining hosts for this loop 18823 1726855018.79991: done getting the remaining hosts for this loop 18823 1726855018.79993: getting the next task for host managed_node2 18823 1726855018.79996: done getting next task for host managed_node2 18823 1726855018.79997: ^ task is: None 18823 1726855018.79998: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.79999: done queuing things up, now waiting for results queue to drain 18823 1726855018.80000: results queue empty 18823 1726855018.80001: checking for any_errors_fatal 18823 1726855018.80002: done checking for any_errors_fatal 18823 1726855018.80005: checking for max_fail_percentage 18823 1726855018.80006: done checking for max_fail_percentage 18823 1726855018.80007: checking to see if all hosts have failed and the running result is not ok 18823 1726855018.80007: done checking to see if all hosts have failed 18823 1726855018.80009: getting the next task for host managed_node2 18823 1726855018.80011: done getting next task for host managed_node2 18823 1726855018.80012: ^ task is: None 18823 1726855018.80013: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.80064: in VariableManager get_vars() 18823 1726855018.80083: done with get_vars() 18823 1726855018.80092: in VariableManager get_vars() 18823 1726855018.80110: done with get_vars() 18823 1726855018.80115: variable 'omit' from source: magic vars 18823 1726855018.80141: in VariableManager get_vars() 18823 1726855018.80150: done with get_vars() 18823 1726855018.80165: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18823 1726855018.80726: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855018.80750: getting the remaining hosts for this loop 18823 1726855018.80751: done getting the remaining hosts for this loop 18823 1726855018.80753: getting the next task for host managed_node2 18823 1726855018.80756: done getting next task for host managed_node2 18823 1726855018.80758: ^ task is: TASK: Gathering Facts 18823 1726855018.80759: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855018.80761: getting variables 18823 1726855018.80762: in VariableManager get_vars() 18823 1726855018.80815: Calling all_inventory to load vars for managed_node2 18823 1726855018.80817: Calling groups_inventory to load vars for managed_node2 18823 1726855018.80819: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855018.80824: Calling all_plugins_play to load vars for managed_node2 18823 1726855018.80826: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855018.80828: Calling groups_plugins_play to load vars for managed_node2 18823 1726855018.80960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855018.81148: done with get_vars() 18823 1726855018.81156: done getting variables 18823 1726855018.81199: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 13:56:58 -0400 (0:00:00.064) 0:00:10.463 ****** 18823 1726855018.81221: entering _queue_task() for managed_node2/gather_facts 18823 1726855018.81505: worker is 1 (out of 1 available) 18823 1726855018.81517: exiting _queue_task() for managed_node2/gather_facts 18823 1726855018.81529: done queuing things up, now waiting for results queue to drain 18823 1726855018.81530: waiting for pending results... 
18823 1726855018.81955: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855018.82031: in run() - task 0affcc66-ac2b-d391-077c-000000000237 18823 1726855018.82070: variable 'ansible_search_path' from source: unknown 18823 1726855018.82079: calling self._execute() 18823 1726855018.82171: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.82178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.82205: variable 'omit' from source: magic vars 18823 1726855018.82566: variable 'ansible_distribution_major_version' from source: facts 18823 1726855018.82570: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855018.82572: variable 'omit' from source: magic vars 18823 1726855018.82576: variable 'omit' from source: magic vars 18823 1726855018.82597: variable 'omit' from source: magic vars 18823 1726855018.82631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855018.82659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855018.82751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855018.82754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855018.82757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855018.82779: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855018.82784: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.82786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.82849: Set connection var ansible_timeout to 10 18823 1726855018.82855: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855018.82858: Set connection var ansible_shell_type to sh 18823 1726855018.82863: Set connection var ansible_shell_executable to /bin/sh 18823 1726855018.82868: Set connection var ansible_connection to ssh 18823 1726855018.82872: Set connection var ansible_pipelining to False 18823 1726855018.82894: variable 'ansible_shell_executable' from source: unknown 18823 1726855018.82897: variable 'ansible_connection' from source: unknown 18823 1726855018.82900: variable 'ansible_module_compression' from source: unknown 18823 1726855018.82902: variable 'ansible_shell_type' from source: unknown 18823 1726855018.82904: variable 'ansible_shell_executable' from source: unknown 18823 1726855018.82909: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855018.82912: variable 'ansible_pipelining' from source: unknown 18823 1726855018.82915: variable 'ansible_timeout' from source: unknown 18823 1726855018.82920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855018.83055: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855018.83062: variable 'omit' from source: magic vars 18823 1726855018.83068: starting attempt loop 18823 1726855018.83071: running the handler 18823 1726855018.83083: variable 'ansible_facts' from source: unknown 18823 1726855018.83102: _low_level_execute_command(): starting 18823 1726855018.83112: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855018.83986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855018.84308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.84319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.85934: stdout chunk (state=3): >>>/root <<< 18823 1726855018.86156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.86193: stdout chunk (state=3): >>><<< 18823 1726855018.86198: stderr chunk (state=3): >>><<< 18823 1726855018.86339: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.86343: _low_level_execute_command(): starting 18823 1726855018.86347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429 `" && echo ansible-tmp-1726855018.8622694-19354-109187977081429="` echo /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429 `" ) && sleep 0' 18823 1726855018.86933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855018.87038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.87074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.87185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.89215: stdout chunk (state=3): >>>ansible-tmp-1726855018.8622694-19354-109187977081429=/root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429 <<< 18823 1726855018.89333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.89378: stderr chunk (state=3): >>><<< 18823 1726855018.89437: stdout chunk (state=3): >>><<< 18823 1726855018.89489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855018.8622694-19354-109187977081429=/root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.89625: variable 'ansible_module_compression' from source: unknown 18823 1726855018.89792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855018.89795: variable 'ansible_facts' from source: unknown 18823 1726855018.90001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py 18823 1726855018.90134: Sending initial data 18823 1726855018.90238: Sent initial data (154 bytes) 18823 1726855018.90828: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855018.90842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855018.90861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855018.90889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855018.91004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 
setting O_NONBLOCK <<< 18823 1726855018.91020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.91123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.92683: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18823 1726855018.92719: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18823 1726855018.92740: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855018.92865: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855018.92913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp87i7cc40 /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py <<< 18823 1726855018.92947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py" <<< 18823 1726855018.93044: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp87i7cc40" to remote "/root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py" <<< 18823 1726855018.95045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.95050: stdout chunk (state=3): >>><<< 18823 1726855018.95053: stderr chunk (state=3): >>><<< 18823 1726855018.95055: done transferring module to remote 18823 1726855018.95057: _low_level_execute_command(): starting 18823 1726855018.95059: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/ /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py && sleep 0' 18823 1726855018.95816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.96037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.96156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855018.96286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855018.98419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855018.98423: stdout chunk (state=3): >>><<< 18823 1726855018.98425: stderr chunk (state=3): >>><<< 18823 1726855018.98428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855018.98434: _low_level_execute_command(): starting 18823 1726855018.98437: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/AnsiballZ_setup.py && sleep 0' 18823 1726855018.99825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855018.99896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855018.99919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855018.99957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855019.00180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855019.65949: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_loadavg": {"1m": 0.646484375, "5m": 0.4248046875, "15m": 0.21435546875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "59", "epoch": "1726855019", "epoch_int": "1726855019", "date": "2024-09-20", "time": "13:56:59", "iso8601_micro": "2024-09-20T17:56:59.274910Z", "iso8601": "2024-09-20T17:56:59Z", "iso8601_basic": "20240920T135659274910", "iso8601_basic_short": "20240920T135659", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, 
"labels": {}, "masters": {}}, "ansible_uptime_seconds": 802, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794971648, "block_size": 4096, "block_total": 65519099, "block_available": 63914788, "block_used": 1604311, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo", "lsr27", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_g<<< 18823 1726855019.66015: stdout chunk (state=3): >>>so_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": 
"off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"]}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855019.68061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855019.68066: stdout chunk (state=3): >>><<< 18823 1726855019.68068: stderr chunk (state=3): >>><<< 18823 1726855019.68309: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": 
"3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_loadavg": {"1m": 0.646484375, "5m": 0.4248046875, "15m": 0.21435546875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "59", "epoch": "1726855019", "epoch_int": "1726855019", "date": "2024-09-20", "time": "13:56:59", "iso8601_micro": "2024-09-20T17:56:59.274910Z", "iso8601": "2024-09-20T17:56:59Z", "iso8601_basic": "20240920T135659274910", "iso8601_basic_short": "20240920T135659", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 802, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794971648, "block_size": 4096, "block_total": 65519099, "block_available": 63914788, "block_used": 1604311, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo", "lsr27", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": 
"off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"]}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855019.68932: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855019.68948: _low_level_execute_command(): starting 18823 1726855019.68952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855018.8622694-19354-109187977081429/ > /dev/null 2>&1 && sleep 0' 18823 1726855019.69695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855019.69841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855019.71640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855019.71671: stderr chunk (state=3): >>><<< 18823 1726855019.71677: stdout chunk (state=3): >>><<< 18823 1726855019.71695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855019.71704: handler run complete 18823 1726855019.71799: variable 'ansible_facts' from source: unknown 18823 1726855019.71867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.72108: variable 'ansible_facts' from source: unknown 18823 1726855019.72166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.72258: attempt loop complete, returning result 18823 1726855019.72261: _execute() done 18823 1726855019.72266: dumping result to json 18823 1726855019.72291: done dumping result, returning 18823 1726855019.72299: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-000000000237] 18823 1726855019.72301: sending task result for task 0affcc66-ac2b-d391-077c-000000000237 18823 1726855019.72849: done sending task result for task 0affcc66-ac2b-d391-077c-000000000237 18823 1726855019.72852: WORKER PROCESS EXITING ok: [managed_node2] 18823 1726855019.73026: no more pending results, returning what we have 18823 1726855019.73029: results queue empty 18823 1726855019.73030: checking for any_errors_fatal 18823 1726855019.73030: done checking for any_errors_fatal 18823 1726855019.73031: checking for max_fail_percentage 18823 1726855019.73032: done checking for max_fail_percentage 18823 1726855019.73032: checking to see if all hosts have failed and the running result is not ok 18823 1726855019.73033: done checking to see if all hosts have failed 18823 1726855019.73033: getting the remaining hosts for this loop 18823 1726855019.73034: done getting the remaining hosts for this loop 18823 
1726855019.73037: getting the next task for host managed_node2 18823 1726855019.73040: done getting next task for host managed_node2 18823 1726855019.73041: ^ task is: TASK: meta (flush_handlers) 18823 1726855019.73043: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855019.73045: getting variables 18823 1726855019.73046: in VariableManager get_vars() 18823 1726855019.73065: Calling all_inventory to load vars for managed_node2 18823 1726855019.73068: Calling groups_inventory to load vars for managed_node2 18823 1726855019.73070: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.73082: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.73084: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.73085: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.73245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.73474: done with get_vars() 18823 1726855019.73484: done getting variables 18823 1726855019.73552: in VariableManager get_vars() 18823 1726855019.73564: Calling all_inventory to load vars for managed_node2 18823 1726855019.73566: Calling groups_inventory to load vars for managed_node2 18823 1726855019.73568: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.73572: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.73574: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.73577: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.73730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 18823 1726855019.73945: done with get_vars() 18823 1726855019.73958: done queuing things up, now waiting for results queue to drain 18823 1726855019.73960: results queue empty 18823 1726855019.73961: checking for any_errors_fatal 18823 1726855019.73964: done checking for any_errors_fatal 18823 1726855019.73964: checking for max_fail_percentage 18823 1726855019.73965: done checking for max_fail_percentage 18823 1726855019.73966: checking to see if all hosts have failed and the running result is not ok 18823 1726855019.73967: done checking to see if all hosts have failed 18823 1726855019.73967: getting the remaining hosts for this loop 18823 1726855019.73968: done getting the remaining hosts for this loop 18823 1726855019.73975: getting the next task for host managed_node2 18823 1726855019.73979: done getting next task for host managed_node2 18823 1726855019.73982: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18823 1726855019.73983: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855019.73994: getting variables 18823 1726855019.73996: in VariableManager get_vars() 18823 1726855019.74008: Calling all_inventory to load vars for managed_node2 18823 1726855019.74010: Calling groups_inventory to load vars for managed_node2 18823 1726855019.74012: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.74016: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.74018: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.74021: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.74172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.74412: done with get_vars() 18823 1726855019.74420: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:56:59 -0400 (0:00:00.932) 0:00:11.396 ****** 18823 1726855019.74490: entering _queue_task() for managed_node2/include_tasks 18823 1726855019.74775: worker is 1 (out of 1 available) 18823 1726855019.74789: exiting _queue_task() for managed_node2/include_tasks 18823 1726855019.74801: done queuing things up, now waiting for results queue to drain 18823 1726855019.74802: waiting for pending results... 
18823 1726855019.75208: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18823 1726855019.75213: in run() - task 0affcc66-ac2b-d391-077c-000000000019 18823 1726855019.75215: variable 'ansible_search_path' from source: unknown 18823 1726855019.75218: variable 'ansible_search_path' from source: unknown 18823 1726855019.75220: calling self._execute() 18823 1726855019.75296: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.75312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855019.75328: variable 'omit' from source: magic vars 18823 1726855019.75692: variable 'ansible_distribution_major_version' from source: facts 18823 1726855019.75710: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855019.75720: _execute() done 18823 1726855019.75728: dumping result to json 18823 1726855019.75743: done dumping result, returning 18823 1726855019.75756: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-d391-077c-000000000019] 18823 1726855019.75766: sending task result for task 0affcc66-ac2b-d391-077c-000000000019 18823 1726855019.75907: no more pending results, returning what we have 18823 1726855019.75912: in VariableManager get_vars() 18823 1726855019.75952: Calling all_inventory to load vars for managed_node2 18823 1726855019.75955: Calling groups_inventory to load vars for managed_node2 18823 1726855019.75957: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.75970: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.75973: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.75976: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.76304: done sending task result for task 0affcc66-ac2b-d391-077c-000000000019 18823 
1726855019.76308: WORKER PROCESS EXITING 18823 1726855019.76328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.76574: done with get_vars() 18823 1726855019.76582: variable 'ansible_search_path' from source: unknown 18823 1726855019.76583: variable 'ansible_search_path' from source: unknown 18823 1726855019.76616: we have included files to process 18823 1726855019.76617: generating all_blocks data 18823 1726855019.76619: done generating all_blocks data 18823 1726855019.76620: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855019.76625: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855019.76627: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855019.77414: done processing included file 18823 1726855019.77416: iterating over new_blocks loaded from include file 18823 1726855019.77417: in VariableManager get_vars() 18823 1726855019.77436: done with get_vars() 18823 1726855019.77438: filtering new block on tags 18823 1726855019.77453: done filtering new block on tags 18823 1726855019.77456: in VariableManager get_vars() 18823 1726855019.77479: done with get_vars() 18823 1726855019.77481: filtering new block on tags 18823 1726855019.77507: done filtering new block on tags 18823 1726855019.77510: in VariableManager get_vars() 18823 1726855019.77529: done with get_vars() 18823 1726855019.77531: filtering new block on tags 18823 1726855019.77547: done filtering new block on tags 18823 1726855019.77549: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 18823 1726855019.77554: extending task lists for all hosts 
with included blocks 18823 1726855019.77984: done extending task lists 18823 1726855019.77986: done processing included files 18823 1726855019.77986: results queue empty 18823 1726855019.77989: checking for any_errors_fatal 18823 1726855019.77991: done checking for any_errors_fatal 18823 1726855019.77991: checking for max_fail_percentage 18823 1726855019.77992: done checking for max_fail_percentage 18823 1726855019.77993: checking to see if all hosts have failed and the running result is not ok 18823 1726855019.77994: done checking to see if all hosts have failed 18823 1726855019.77995: getting the remaining hosts for this loop 18823 1726855019.77996: done getting the remaining hosts for this loop 18823 1726855019.77999: getting the next task for host managed_node2 18823 1726855019.78009: done getting next task for host managed_node2 18823 1726855019.78012: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18823 1726855019.78014: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855019.78023: getting variables 18823 1726855019.78024: in VariableManager get_vars() 18823 1726855019.78041: Calling all_inventory to load vars for managed_node2 18823 1726855019.78044: Calling groups_inventory to load vars for managed_node2 18823 1726855019.78046: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.78050: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.78053: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.78055: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.78235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.78484: done with get_vars() 18823 1726855019.78494: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:56:59 -0400 (0:00:00.040) 0:00:11.437 ****** 18823 1726855019.78567: entering _queue_task() for managed_node2/setup 18823 1726855019.78868: worker is 1 (out of 1 available) 18823 1726855019.78998: exiting _queue_task() for managed_node2/setup 18823 1726855019.79011: done queuing things up, now waiting for results queue to drain 18823 1726855019.79012: waiting for pending results... 
18823 1726855019.79173: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18823 1726855019.79325: in run() - task 0affcc66-ac2b-d391-077c-000000000279 18823 1726855019.79347: variable 'ansible_search_path' from source: unknown 18823 1726855019.79355: variable 'ansible_search_path' from source: unknown 18823 1726855019.79396: calling self._execute() 18823 1726855019.79494: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.79508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855019.79522: variable 'omit' from source: magic vars 18823 1726855019.79920: variable 'ansible_distribution_major_version' from source: facts 18823 1726855019.79935: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855019.80162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855019.82531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855019.82621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855019.82658: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855019.82710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855019.82739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855019.82830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855019.82858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855019.82883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855019.82937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855019.82954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855019.83027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855019.83055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855019.83107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855019.83301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855019.83307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855019.83357: variable '__network_required_facts' from source: role 
'' defaults 18823 1726855019.83372: variable 'ansible_facts' from source: unknown 18823 1726855019.83493: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18823 1726855019.83505: when evaluation is False, skipping this task 18823 1726855019.83518: _execute() done 18823 1726855019.83534: dumping result to json 18823 1726855019.83543: done dumping result, returning 18823 1726855019.83556: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-d391-077c-000000000279] 18823 1726855019.83566: sending task result for task 0affcc66-ac2b-d391-077c-000000000279 18823 1726855019.83786: done sending task result for task 0affcc66-ac2b-d391-077c-000000000279 18823 1726855019.83791: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855019.83844: no more pending results, returning what we have 18823 1726855019.83847: results queue empty 18823 1726855019.83849: checking for any_errors_fatal 18823 1726855019.83851: done checking for any_errors_fatal 18823 1726855019.83851: checking for max_fail_percentage 18823 1726855019.83853: done checking for max_fail_percentage 18823 1726855019.83854: checking to see if all hosts have failed and the running result is not ok 18823 1726855019.83855: done checking to see if all hosts have failed 18823 1726855019.83855: getting the remaining hosts for this loop 18823 1726855019.83857: done getting the remaining hosts for this loop 18823 1726855019.83861: getting the next task for host managed_node2 18823 1726855019.83870: done getting next task for host managed_node2 18823 1726855019.83874: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18823 1726855019.83877: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855019.83894: getting variables 18823 1726855019.83896: in VariableManager get_vars() 18823 1726855019.83941: Calling all_inventory to load vars for managed_node2 18823 1726855019.83944: Calling groups_inventory to load vars for managed_node2 18823 1726855019.83947: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.83959: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.83962: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.83965: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.84464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.84706: done with get_vars() 18823 1726855019.84719: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:56:59 -0400 (0:00:00.062) 0:00:11.500 ****** 18823 1726855019.84846: entering _queue_task() for managed_node2/stat 18823 1726855019.85363: worker is 1 (out of 1 available) 18823 1726855019.85376: exiting _queue_task() for managed_node2/stat 18823 1726855019.85389: done queuing things up, now waiting for results queue to drain 18823 1726855019.85390: waiting for pending results... 
18823 1726855019.86108: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 18823 1726855019.86114: in run() - task 0affcc66-ac2b-d391-077c-00000000027b 18823 1726855019.86117: variable 'ansible_search_path' from source: unknown 18823 1726855019.86120: variable 'ansible_search_path' from source: unknown 18823 1726855019.86122: calling self._execute() 18823 1726855019.86263: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.86268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855019.86271: variable 'omit' from source: magic vars 18823 1726855019.86795: variable 'ansible_distribution_major_version' from source: facts 18823 1726855019.86799: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855019.86982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855019.87592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855019.87623: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855019.87664: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855019.87713: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855019.87885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855019.87891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855019.87893: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855019.87918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855019.88155: variable '__network_is_ostree' from source: set_fact 18823 1726855019.88206: Evaluated conditional (not __network_is_ostree is defined): False 18823 1726855019.88348: when evaluation is False, skipping this task 18823 1726855019.88351: _execute() done 18823 1726855019.88354: dumping result to json 18823 1726855019.88356: done dumping result, returning 18823 1726855019.88359: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-d391-077c-00000000027b] 18823 1726855019.88361: sending task result for task 0affcc66-ac2b-d391-077c-00000000027b 18823 1726855019.88486: done sending task result for task 0affcc66-ac2b-d391-077c-00000000027b 18823 1726855019.88492: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18823 1726855019.88549: no more pending results, returning what we have 18823 1726855019.88553: results queue empty 18823 1726855019.88554: checking for any_errors_fatal 18823 1726855019.88560: done checking for any_errors_fatal 18823 1726855019.88561: checking for max_fail_percentage 18823 1726855019.88563: done checking for max_fail_percentage 18823 1726855019.88564: checking to see if all hosts have failed and the running result is not ok 18823 1726855019.88565: done checking to see if all hosts have failed 18823 1726855019.88565: getting the remaining hosts for this loop 18823 1726855019.88567: done getting the remaining hosts for this loop 18823 
1726855019.88571: getting the next task for host managed_node2 18823 1726855019.88578: done getting next task for host managed_node2 18823 1726855019.88582: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18823 1726855019.88585: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855019.88810: getting variables 18823 1726855019.88812: in VariableManager get_vars() 18823 1726855019.88851: Calling all_inventory to load vars for managed_node2 18823 1726855019.88855: Calling groups_inventory to load vars for managed_node2 18823 1726855019.88858: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.88869: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.88873: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.88876: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.90072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.90321: done with get_vars() 18823 1726855019.90334: done getting variables 18823 1726855019.90399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:56:59 -0400 (0:00:00.055) 0:00:11.556 ****** 18823 1726855019.90432: entering _queue_task() for managed_node2/set_fact 18823 1726855019.90828: worker is 1 (out of 1 available) 18823 1726855019.90838: exiting _queue_task() for managed_node2/set_fact 18823 1726855019.90848: done queuing things up, now waiting for results queue to drain 18823 1726855019.90849: waiting for pending results... 18823 1726855019.91200: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18823 1726855019.91205: in run() - task 0affcc66-ac2b-d391-077c-00000000027c 18823 1726855019.91208: variable 'ansible_search_path' from source: unknown 18823 1726855019.91210: variable 'ansible_search_path' from source: unknown 18823 1726855019.91213: calling self._execute() 18823 1726855019.91311: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.91336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855019.91419: variable 'omit' from source: magic vars 18823 1726855019.92297: variable 'ansible_distribution_major_version' from source: facts 18823 1726855019.92300: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855019.92694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855019.93062: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855019.93235: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855019.93269: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 
1726855019.93454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855019.93575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855019.93660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855019.93721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855019.93895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855019.94016: variable '__network_is_ostree' from source: set_fact 18823 1726855019.94027: Evaluated conditional (not __network_is_ostree is defined): False 18823 1726855019.94034: when evaluation is False, skipping this task 18823 1726855019.94041: _execute() done 18823 1726855019.94070: dumping result to json 18823 1726855019.94079: done dumping result, returning 18823 1726855019.94094: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-d391-077c-00000000027c] 18823 1726855019.94118: sending task result for task 0affcc66-ac2b-d391-077c-00000000027c 18823 1726855019.94506: done sending task result for task 0affcc66-ac2b-d391-077c-00000000027c 18823 1726855019.94509: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18823 1726855019.94555: no more pending results, returning what we 
have 18823 1726855019.94558: results queue empty 18823 1726855019.94559: checking for any_errors_fatal 18823 1726855019.94563: done checking for any_errors_fatal 18823 1726855019.94564: checking for max_fail_percentage 18823 1726855019.94566: done checking for max_fail_percentage 18823 1726855019.94567: checking to see if all hosts have failed and the running result is not ok 18823 1726855019.94568: done checking to see if all hosts have failed 18823 1726855019.94568: getting the remaining hosts for this loop 18823 1726855019.94570: done getting the remaining hosts for this loop 18823 1726855019.94573: getting the next task for host managed_node2 18823 1726855019.94580: done getting next task for host managed_node2 18823 1726855019.94584: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18823 1726855019.94589: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855019.94602: getting variables 18823 1726855019.94604: in VariableManager get_vars() 18823 1726855019.94638: Calling all_inventory to load vars for managed_node2 18823 1726855019.94641: Calling groups_inventory to load vars for managed_node2 18823 1726855019.94643: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855019.94652: Calling all_plugins_play to load vars for managed_node2 18823 1726855019.94655: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855019.94658: Calling groups_plugins_play to load vars for managed_node2 18823 1726855019.94954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855019.95227: done with get_vars() 18823 1726855019.95242: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:56:59 -0400 (0:00:00.048) 0:00:11.605 ****** 18823 1726855019.95334: entering _queue_task() for managed_node2/service_facts 18823 1726855019.95336: Creating lock for service_facts 18823 1726855019.95713: worker is 1 (out of 1 available) 18823 1726855019.95725: exiting _queue_task() for managed_node2/service_facts 18823 1726855019.95736: done queuing things up, now waiting for results queue to drain 18823 1726855019.95737: waiting for pending results... 
18823 1726855019.96156: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 18823 1726855019.96594: in run() - task 0affcc66-ac2b-d391-077c-00000000027e 18823 1726855019.96598: variable 'ansible_search_path' from source: unknown 18823 1726855019.96601: variable 'ansible_search_path' from source: unknown 18823 1726855019.96604: calling self._execute() 18823 1726855019.96901: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.96905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855019.96908: variable 'omit' from source: magic vars 18823 1726855019.97348: variable 'ansible_distribution_major_version' from source: facts 18823 1726855019.97366: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855019.97378: variable 'omit' from source: magic vars 18823 1726855019.97451: variable 'omit' from source: magic vars 18823 1726855019.97496: variable 'omit' from source: magic vars 18823 1726855019.97542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855019.97592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855019.97620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855019.97642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855019.97662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855019.97704: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855019.97713: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.97720: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 18823 1726855019.97831: Set connection var ansible_timeout to 10 18823 1726855019.97845: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855019.97852: Set connection var ansible_shell_type to sh 18823 1726855019.97863: Set connection var ansible_shell_executable to /bin/sh 18823 1726855019.97894: Set connection var ansible_connection to ssh 18823 1726855019.97897: Set connection var ansible_pipelining to False 18823 1726855019.97924: variable 'ansible_shell_executable' from source: unknown 18823 1726855019.97986: variable 'ansible_connection' from source: unknown 18823 1726855019.97992: variable 'ansible_module_compression' from source: unknown 18823 1726855019.97994: variable 'ansible_shell_type' from source: unknown 18823 1726855019.97998: variable 'ansible_shell_executable' from source: unknown 18823 1726855019.98003: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855019.98006: variable 'ansible_pipelining' from source: unknown 18823 1726855019.98007: variable 'ansible_timeout' from source: unknown 18823 1726855019.98009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855019.98175: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855019.98194: variable 'omit' from source: magic vars 18823 1726855019.98210: starting attempt loop 18823 1726855019.98221: running the handler 18823 1726855019.98242: _low_level_execute_command(): starting 18823 1726855019.98312: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855019.99194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855019.99216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 18823 1726855019.99233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855019.99252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855019.99311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855019.99410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855019.99422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855019.99696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855020.01347: stdout chunk (state=3): >>>/root <<< 18823 1726855020.01550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855020.01554: stdout chunk (state=3): >>><<< 18823 1726855020.01557: stderr chunk (state=3): >>><<< 18823 1726855020.01583: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855020.01895: _low_level_execute_command(): starting 18823 1726855020.01901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843 `" && echo ansible-tmp-1726855020.0179534-19413-154914087095843="` echo /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843 `" ) && sleep 0' 18823 1726855020.03100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855020.03415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855020.05495: stdout chunk (state=3): >>>ansible-tmp-1726855020.0179534-19413-154914087095843=/root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843 <<< 18823 1726855020.05500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855020.05530: stderr chunk (state=3): >>><<< 18823 1726855020.05538: stdout chunk (state=3): >>><<< 18823 1726855020.05564: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855020.0179534-19413-154914087095843=/root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855020.05696: variable 'ansible_module_compression' from source: unknown 18823 1726855020.05699: ANSIBALLZ: Using lock for service_facts 18823 1726855020.06191: ANSIBALLZ: Acquiring lock 18823 1726855020.06195: ANSIBALLZ: Lock acquired: 140142263955920 18823 1726855020.06197: ANSIBALLZ: Creating module 18823 1726855020.34374: ANSIBALLZ: Writing module into payload 18823 1726855020.34486: ANSIBALLZ: Writing module 18823 1726855020.34522: ANSIBALLZ: Renaming module 18823 1726855020.34594: ANSIBALLZ: Done creating module 18823 1726855020.34599: variable 'ansible_facts' from source: unknown 18823 1726855020.34633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py 18823 1726855020.34815: Sending initial data 18823 1726855020.34823: Sent initial data (162 bytes) 18823 1726855020.35382: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855020.35400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855020.35492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855020.35510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855020.35526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855020.35540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855020.35645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855020.37494: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855020.37503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855020.37578: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpju3klah7 /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py <<< 18823 1726855020.37582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py" <<< 18823 1726855020.37733: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpju3klah7" to remote "/root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py" <<< 18823 1726855020.39079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855020.39142: stderr chunk (state=3): >>><<< 18823 1726855020.39146: stdout chunk (state=3): >>><<< 18823 1726855020.39267: done transferring module to remote 18823 1726855020.39292: _low_level_execute_command(): starting 18823 1726855020.39295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/ /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py && sleep 0' 18823 1726855020.40289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855020.40318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855020.40333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855020.40368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855020.40372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855020.40448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855020.42686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855020.42693: stdout chunk (state=3): >>><<< 18823 1726855020.42695: stderr chunk (state=3): >>><<< 18823 1726855020.42707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855020.42716: _low_level_execute_command(): starting 18823 1726855020.42733: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/AnsiballZ_service_facts.py && sleep 0' 18823 1726855020.43473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855020.43512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855020.43577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855020.43580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855020.43714: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855021.98232: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": 
"rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 18823 1726855021.98272: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18823 1726855021.99812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855021.99889: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. <<< 18823 1726855021.99894: stdout chunk (state=3): >>><<< 18823 1726855021.99897: stderr chunk (state=3): >>><<< 18823 1726855022.00094: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", 
"state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855022.01965: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855022.01972: _low_level_execute_command(): starting 18823 1726855022.01979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855020.0179534-19413-154914087095843/ > /dev/null 2>&1 && sleep 0' 18823 1726855022.02577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855022.02649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855022.02692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855022.02906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855022.03085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855022.05010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855022.05061: stderr chunk (state=3): >>><<< 18823 1726855022.05064: stdout chunk (state=3): >>><<< 18823 1726855022.05193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855022.05197: handler run complete 18823 1726855022.05293: variable 'ansible_facts' from source: unknown 18823 
1726855022.05440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855022.06197: variable 'ansible_facts' from source: unknown 18823 1726855022.06200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855022.06205: attempt loop complete, returning result 18823 1726855022.06207: _execute() done 18823 1726855022.06209: dumping result to json 18823 1726855022.06327: done dumping result, returning 18823 1726855022.06330: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-d391-077c-00000000027e] 18823 1726855022.06333: sending task result for task 0affcc66-ac2b-d391-077c-00000000027e ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855022.07076: no more pending results, returning what we have 18823 1726855022.07079: results queue empty 18823 1726855022.07080: checking for any_errors_fatal 18823 1726855022.07085: done checking for any_errors_fatal 18823 1726855022.07086: checking for max_fail_percentage 18823 1726855022.07090: done checking for max_fail_percentage 18823 1726855022.07092: checking to see if all hosts have failed and the running result is not ok 18823 1726855022.07092: done checking to see if all hosts have failed 18823 1726855022.07093: getting the remaining hosts for this loop 18823 1726855022.07094: done getting the remaining hosts for this loop 18823 1726855022.07098: getting the next task for host managed_node2 18823 1726855022.07109: done getting next task for host managed_node2 18823 1726855022.07113: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18823 1726855022.07116: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855022.07126: getting variables 18823 1726855022.07127: in VariableManager get_vars() 18823 1726855022.07158: Calling all_inventory to load vars for managed_node2 18823 1726855022.07161: Calling groups_inventory to load vars for managed_node2 18823 1726855022.07163: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855022.07172: Calling all_plugins_play to load vars for managed_node2 18823 1726855022.07175: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855022.07178: Calling groups_plugins_play to load vars for managed_node2 18823 1726855022.07745: done sending task result for task 0affcc66-ac2b-d391-077c-00000000027e 18823 1726855022.07749: WORKER PROCESS EXITING 18823 1726855022.07820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855022.08350: done with get_vars() 18823 1726855022.08365: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:57:02 -0400 (0:00:02.131) 0:00:13.736 ****** 18823 1726855022.08478: entering _queue_task() for managed_node2/package_facts 18823 1726855022.08479: Creating lock for package_facts 18823 1726855022.08891: worker is 1 (out of 1 available) 18823 1726855022.08907: exiting _queue_task() for managed_node2/package_facts 18823 1726855022.08919: 
done queuing things up, now waiting for results queue to drain 18823 1726855022.08920: waiting for pending results... 18823 1726855022.09199: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 18823 1726855022.09309: in run() - task 0affcc66-ac2b-d391-077c-00000000027f 18823 1726855022.09408: variable 'ansible_search_path' from source: unknown 18823 1726855022.09412: variable 'ansible_search_path' from source: unknown 18823 1726855022.09415: calling self._execute() 18823 1726855022.09473: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855022.09485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855022.09513: variable 'omit' from source: magic vars 18823 1726855022.09911: variable 'ansible_distribution_major_version' from source: facts 18823 1726855022.09928: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855022.09937: variable 'omit' from source: magic vars 18823 1726855022.09998: variable 'omit' from source: magic vars 18823 1726855022.10035: variable 'omit' from source: magic vars 18823 1726855022.10081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855022.10166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855022.10169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855022.10171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855022.10180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855022.10216: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855022.10223: variable 'ansible_host' 
from source: host vars for 'managed_node2' 18823 1726855022.10229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855022.10331: Set connection var ansible_timeout to 10 18823 1726855022.10342: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855022.10347: Set connection var ansible_shell_type to sh 18823 1726855022.10355: Set connection var ansible_shell_executable to /bin/sh 18823 1726855022.10363: Set connection var ansible_connection to ssh 18823 1726855022.10380: Set connection var ansible_pipelining to False 18823 1726855022.10411: variable 'ansible_shell_executable' from source: unknown 18823 1726855022.10492: variable 'ansible_connection' from source: unknown 18823 1726855022.10495: variable 'ansible_module_compression' from source: unknown 18823 1726855022.10497: variable 'ansible_shell_type' from source: unknown 18823 1726855022.10499: variable 'ansible_shell_executable' from source: unknown 18823 1726855022.10500: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855022.10505: variable 'ansible_pipelining' from source: unknown 18823 1726855022.10506: variable 'ansible_timeout' from source: unknown 18823 1726855022.10508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855022.10653: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855022.10670: variable 'omit' from source: magic vars 18823 1726855022.10680: starting attempt loop 18823 1726855022.10689: running the handler 18823 1726855022.10717: _low_level_execute_command(): starting 18823 1726855022.10730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855022.11493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 18823 1726855022.11508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855022.11564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855022.11568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855022.11641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855022.13358: stdout chunk (state=3): >>>/root <<< 18823 1726855022.13515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855022.13520: stderr chunk (state=3): >>><<< 18823 1726855022.13544: stdout chunk (state=3): >>><<< 18823 1726855022.13665: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855022.13668: _low_level_execute_command(): starting 18823 1726855022.13672: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367 `" && echo ansible-tmp-1726855022.1356728-19529-84546836598367="` echo /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367 `" ) && sleep 0' 18823 1726855022.14234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855022.14294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855022.14310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855022.14382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855022.14411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855022.14444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855022.14540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855022.16500: stdout chunk (state=3): >>>ansible-tmp-1726855022.1356728-19529-84546836598367=/root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367 <<< 18823 1726855022.16661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855022.16665: stdout chunk (state=3): >>><<< 18823 1726855022.16667: stderr chunk (state=3): >>><<< 18823 1726855022.16897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855022.1356728-19529-84546836598367=/root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855022.16900: variable 'ansible_module_compression' from source: unknown 18823 1726855022.16903: ANSIBALLZ: Using lock for package_facts 18823 1726855022.16905: ANSIBALLZ: Acquiring lock 18823 1726855022.16907: ANSIBALLZ: Lock acquired: 140142267656064 18823 1726855022.16909: ANSIBALLZ: Creating module 18823 1726855022.66585: ANSIBALLZ: Writing module into payload 18823 1726855022.66922: ANSIBALLZ: Writing module 18823 1726855022.66955: ANSIBALLZ: Renaming module 18823 1726855022.67204: ANSIBALLZ: Done creating module 18823 1726855022.67208: variable 'ansible_facts' from source: unknown 18823 1726855022.67449: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py 18823 1726855022.67873: Sending initial data 18823 1726855022.67976: Sent initial data (161 bytes) 18823 1726855022.70160: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855022.70263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855022.70450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855022.70454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855022.70617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855022.70804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855022.72375: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855022.72431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855022.72531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp6wq3vzmi /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py <<< 18823 1726855022.72539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py" <<< 18823 1726855022.72633: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp6wq3vzmi" to remote "/root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py" <<< 18823 1726855022.75495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855022.75509: stderr chunk (state=3): >>><<< 18823 1726855022.75516: stdout chunk (state=3): >>><<< 18823 1726855022.75692: done transferring module to remote 18823 1726855022.75695: _low_level_execute_command(): starting 18823 1726855022.75697: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/ /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py && sleep 0' 18823 1726855022.76906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855022.76922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855022.76975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855022.77139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855022.77209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855022.77247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855022.77476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855022.79794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855022.79797: stdout chunk (state=3): >>><<< 18823 1726855022.79799: stderr chunk (state=3): >>><<< 18823 1726855022.79801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855022.79803: _low_level_execute_command(): starting 18823 1726855022.79805: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/AnsiballZ_package_facts.py && sleep 0' 18823 1726855022.81249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855022.81259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855022.81270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855022.81442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855022.81446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855022.81448: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855022.81450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855022.81453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855022.81455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855022.81520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855022.81693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855023.25821: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": 
"hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": 
"cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": 
"sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": 
"python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 18823 1726855023.26105: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": 
"perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": 
"libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": 
"python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18823 1726855023.27664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855023.27668: stdout chunk (state=3): >>><<< 18823 1726855023.27694: stderr chunk (state=3): >>><<< 18823 1726855023.27739: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
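The large JSON blob above is the `package_facts` module result: a `packages` mapping from package name to a *list* of installed instances, each carrying `name`, `version`, `release`, `epoch`, `arch`, and `source`. As a minimal sketch (using a tiny hypothetical excerpt, not the full output above), this is how that structure can be parsed and turned into rpm-style version strings:

```python
import json

# Hypothetical miniature of ansible_facts.packages; field names match the
# package_facts entries in the log above (name/version/release/epoch/arch/source).
raw = """
{
  "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10",
               "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "kernel": [{"name": "kernel", "version": "6.11.0",
              "release": "0.rc6.23.el10", "epoch": null,
              "arch": "x86_64", "source": "rpm"}]
}
"""
packages = json.loads(raw)

def full_version(pkg):
    """Compose an rpm-style EVR string; a null epoch is simply omitted."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
    return f"{epoch}{pkg['version']}-{pkg['release']}"

# Each key maps to a list because several instances of one package
# (e.g. multiple kernels or arches) can be installed at once.
for name, instances in packages.items():
    for pkg in instances:
        print(name, full_version(pkg))
```

In a playbook the same lookup would read `ansible_facts.packages['openssh'][0].version` after a `package_facts` task has run.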
18823 1726855023.42196: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855023.42461: _low_level_execute_command(): starting 18823 1726855023.42465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855022.1356728-19529-84546836598367/ > /dev/null 2>&1 && sleep 0' 18823 1726855023.43862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855023.43866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855023.44289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855023.44294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855023.44297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855023.44299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855023.44301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855023.44528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855023.44637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855023.46640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855023.46643: stdout chunk (state=3): >>><<< 18823 1726855023.46645: stderr chunk (state=3): >>><<< 18823 1726855023.46668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855023.46676: handler run complete 18823 1726855023.48544: variable 'ansible_facts' from source: unknown 18823 1726855023.49376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855023.53670: variable 'ansible_facts' from source: unknown 18823 1726855023.54684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855023.56232: attempt loop complete, returning result 18823 1726855023.56250: _execute() done 18823 1726855023.56257: dumping result to json 18823 1726855023.56664: done dumping result, returning 18823 1726855023.56866: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-d391-077c-00000000027f] 18823 1726855023.56869: sending task result for task 0affcc66-ac2b-d391-077c-00000000027f 18823 1726855023.61629: done sending task result for task 0affcc66-ac2b-d391-077c-00000000027f 18823 1726855023.61633: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855023.61730: no more pending results, returning what we have 18823 1726855023.61733: results queue empty 18823 1726855023.61734: checking for any_errors_fatal 18823 1726855023.61738: done checking for any_errors_fatal 18823 1726855023.61739: checking for max_fail_percentage 18823 1726855023.61740: done checking for max_fail_percentage 18823 1726855023.61741: checking to see if all hosts have failed and the running result is not ok 18823 1726855023.61742: done checking to see if all hosts have failed 18823 1726855023.61743: getting the remaining hosts for this loop 18823 1726855023.61744: done getting the remaining hosts for this loop 18823 1726855023.61747: getting 
the next task for host managed_node2 18823 1726855023.61753: done getting next task for host managed_node2 18823 1726855023.61756: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18823 1726855023.61758: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855023.61767: getting variables 18823 1726855023.61768: in VariableManager get_vars() 18823 1726855023.61801: Calling all_inventory to load vars for managed_node2 18823 1726855023.61806: Calling groups_inventory to load vars for managed_node2 18823 1726855023.61809: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855023.61817: Calling all_plugins_play to load vars for managed_node2 18823 1726855023.61820: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855023.61823: Calling groups_plugins_play to load vars for managed_node2 18823 1726855023.64247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855023.67643: done with get_vars() 18823 1726855023.67672: done getting variables 18823 1726855023.67837: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:57:03 -0400 (0:00:01.593) 0:00:15.330 ****** 18823 1726855023.67869: entering _queue_task() for managed_node2/debug 
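The repeated `(found_in_cache=True, class_only=...)` markers in the plugin-loading lines reflect a resolve-once, serve-from-cache lookup. A minimal sketch of that behavior, assuming a toy loader (this is an illustration, not Ansible's actual `PluginLoader` implementation):

```python
# Toy model of the "Loading ActionModule ... (found_in_cache=True)" messages:
# the first lookup resolves a plugin name to a path and caches it; later
# lookups for the same name are served from the cache.
class PluginLoader:
    def __init__(self, search_path):
        self.search_path = search_path
        self._cache = {}

    def load(self, name):
        if name in self._cache:
            return self._cache[name], True       # found_in_cache=True
        path = f"{self.search_path}/{name}.py"   # pretend path resolution
        self._cache[name] = path
        return path, False                       # first load, cache miss

loader = PluginLoader(
    "/usr/local/lib/python3.12/site-packages/ansible/plugins/action")
_, cached_first = loader.load("debug")
_, cached_second = loader.load("debug")
```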
18823 1726855023.68722: worker is 1 (out of 1 available) 18823 1726855023.68732: exiting _queue_task() for managed_node2/debug 18823 1726855023.68742: done queuing things up, now waiting for results queue to drain 18823 1726855023.68743: waiting for pending results... 18823 1726855023.69212: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 18823 1726855023.69217: in run() - task 0affcc66-ac2b-d391-077c-00000000001a 18823 1726855023.69220: variable 'ansible_search_path' from source: unknown 18823 1726855023.69223: variable 'ansible_search_path' from source: unknown 18823 1726855023.69226: calling self._execute() 18823 1726855023.69486: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855023.69606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855023.69626: variable 'omit' from source: magic vars 18823 1726855023.70383: variable 'ansible_distribution_major_version' from source: facts 18823 1726855023.70696: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855023.70700: variable 'omit' from source: magic vars 18823 1726855023.70705: variable 'omit' from source: magic vars 18823 1726855023.70709: variable 'network_provider' from source: set_fact 18823 1726855023.70712: variable 'omit' from source: magic vars 18823 1726855023.70925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855023.70962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855023.70991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855023.71095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855023.71115: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855023.71155: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855023.71164: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855023.71173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855023.71374: Set connection var ansible_timeout to 10 18823 1726855023.71384: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855023.71459: Set connection var ansible_shell_type to sh 18823 1726855023.71471: Set connection var ansible_shell_executable to /bin/sh 18823 1726855023.71481: Set connection var ansible_connection to ssh 18823 1726855023.71495: Set connection var ansible_pipelining to False 18823 1726855023.71528: variable 'ansible_shell_executable' from source: unknown 18823 1726855023.71536: variable 'ansible_connection' from source: unknown 18823 1726855023.71569: variable 'ansible_module_compression' from source: unknown 18823 1726855023.71577: variable 'ansible_shell_type' from source: unknown 18823 1726855023.71584: variable 'ansible_shell_executable' from source: unknown 18823 1726855023.71593: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855023.71779: variable 'ansible_pipelining' from source: unknown 18823 1726855023.71783: variable 'ansible_timeout' from source: unknown 18823 1726855023.71785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855023.71932: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855023.71948: variable 'omit' from source: magic vars 18823 1726855023.71959: starting attempt 
loop 18823 1726855023.71999: running the handler 18823 1726855023.72049: handler run complete 18823 1726855023.72119: attempt loop complete, returning result 18823 1726855023.72126: _execute() done 18823 1726855023.72293: dumping result to json 18823 1726855023.72296: done dumping result, returning 18823 1726855023.72299: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-d391-077c-00000000001a] 18823 1726855023.72302: sending task result for task 0affcc66-ac2b-d391-077c-00000000001a 18823 1726855023.72372: done sending task result for task 0affcc66-ac2b-d391-077c-00000000001a 18823 1726855023.72374: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 18823 1726855023.72437: no more pending results, returning what we have 18823 1726855023.72440: results queue empty 18823 1726855023.72441: checking for any_errors_fatal 18823 1726855023.72452: done checking for any_errors_fatal 18823 1726855023.72453: checking for max_fail_percentage 18823 1726855023.72455: done checking for max_fail_percentage 18823 1726855023.72456: checking to see if all hosts have failed and the running result is not ok 18823 1726855023.72457: done checking to see if all hosts have failed 18823 1726855023.72458: getting the remaining hosts for this loop 18823 1726855023.72459: done getting the remaining hosts for this loop 18823 1726855023.72463: getting the next task for host managed_node2 18823 1726855023.72470: done getting next task for host managed_node2 18823 1726855023.72474: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18823 1726855023.72476: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18823 1726855023.72486: getting variables 18823 1726855023.72491: in VariableManager get_vars() 18823 1726855023.72532: Calling all_inventory to load vars for managed_node2 18823 1726855023.72535: Calling groups_inventory to load vars for managed_node2 18823 1726855023.72537: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855023.72548: Calling all_plugins_play to load vars for managed_node2 18823 1726855023.72551: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855023.72554: Calling groups_plugins_play to load vars for managed_node2 18823 1726855023.75761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855023.79033: done with get_vars() 18823 1726855023.79067: done getting variables 18823 1726855023.79244: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:57:03 -0400 (0:00:00.114) 0:00:15.444 ****** 18823 1726855023.79277: entering _queue_task() for managed_node2/fail 18823 1726855023.80515: worker is 1 (out of 1 available) 18823 1726855023.80529: exiting _queue_task() for managed_node2/fail 18823 1726855023.80542: done queuing things up, now waiting for results queue to drain 18823 1726855023.80543: waiting for pending results... 
18823 1726855023.81630: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18823 1726855023.81714: in run() - task 0affcc66-ac2b-d391-077c-00000000001b 18823 1726855023.81719: variable 'ansible_search_path' from source: unknown 18823 1726855023.81722: variable 'ansible_search_path' from source: unknown 18823 1726855023.82039: calling self._execute() 18823 1726855023.82043: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855023.82046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855023.82048: variable 'omit' from source: magic vars 18823 1726855023.83377: variable 'ansible_distribution_major_version' from source: facts 18823 1726855023.83674: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855023.83746: variable 'network_state' from source: role '' defaults 18823 1726855023.83909: Evaluated conditional (network_state != {}): False 18823 1726855023.84009: when evaluation is False, skipping this task 18823 1726855023.84018: _execute() done 18823 1726855023.84028: dumping result to json 18823 1726855023.84220: done dumping result, returning 18823 1726855023.84224: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-d391-077c-00000000001b] 18823 1726855023.84227: sending task result for task 0affcc66-ac2b-d391-077c-00000000001b 18823 1726855023.84308: done sending task result for task 0affcc66-ac2b-d391-077c-00000000001b 18823 1726855023.84312: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855023.84365: no more pending results, 
returning what we have 18823 1726855023.84370: results queue empty 18823 1726855023.84371: checking for any_errors_fatal 18823 1726855023.84378: done checking for any_errors_fatal 18823 1726855023.84379: checking for max_fail_percentage 18823 1726855023.84381: done checking for max_fail_percentage 18823 1726855023.84382: checking to see if all hosts have failed and the running result is not ok 18823 1726855023.84383: done checking to see if all hosts have failed 18823 1726855023.84384: getting the remaining hosts for this loop 18823 1726855023.84385: done getting the remaining hosts for this loop 18823 1726855023.84392: getting the next task for host managed_node2 18823 1726855023.84399: done getting next task for host managed_node2 18823 1726855023.84403: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18823 1726855023.84406: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855023.84421: getting variables 18823 1726855023.84423: in VariableManager get_vars() 18823 1726855023.84462: Calling all_inventory to load vars for managed_node2 18823 1726855023.84465: Calling groups_inventory to load vars for managed_node2 18823 1726855023.84468: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855023.84483: Calling all_plugins_play to load vars for managed_node2 18823 1726855023.84486: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855023.84696: Calling groups_plugins_play to load vars for managed_node2 18823 1726855023.88911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855023.94309: done with get_vars() 18823 1726855023.94354: done getting variables 18823 1726855023.94421: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:57:03 -0400 (0:00:00.151) 0:00:15.596 ****** 18823 1726855023.94454: entering _queue_task() for managed_node2/fail 18823 1726855023.95608: worker is 1 (out of 1 available) 18823 1726855023.95621: exiting _queue_task() for managed_node2/fail 18823 1726855023.95633: done queuing things up, now waiting for results queue to drain 18823 1726855023.95634: waiting for pending results... 
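The two `skipping: [managed_node2]` results in this stretch both report `false_condition: "network_state != {}"`: with the role default `network_state = {}`, the `when:` expression is False, so the `fail` action never runs. A rough Python rendering of that decision (not Ansible source code):

```python
# Sketch of the conditional-skip logic logged above: the "Abort ..." fail
# task runs only when its `when:` expression is truthy. With the role
# default network_state = {}, the expression is False and the task skips.
def evaluate_abort_task(network_state):
    condition = network_state != {}
    if not condition:
        return {"changed": False,
                "false_condition": "network_state != {}",
                "skip_reason": "Conditional result was False"}
    # When the condition holds, the fail action would abort the play.
    return {"changed": False, "failed": True}

result = evaluate_abort_task({})
```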
18823 1726855023.96439: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18823 1726855023.96485: in run() - task 0affcc66-ac2b-d391-077c-00000000001c 18823 1726855023.96552: variable 'ansible_search_path' from source: unknown 18823 1726855023.96562: variable 'ansible_search_path' from source: unknown 18823 1726855023.96606: calling self._execute() 18823 1726855023.96863: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855023.96867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855023.96870: variable 'omit' from source: magic vars 18823 1726855023.97871: variable 'ansible_distribution_major_version' from source: facts 18823 1726855023.97893: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855023.98093: variable 'network_state' from source: role '' defaults 18823 1726855023.98142: Evaluated conditional (network_state != {}): False 18823 1726855023.98149: when evaluation is False, skipping this task 18823 1726855023.98293: _execute() done 18823 1726855023.98297: dumping result to json 18823 1726855023.98300: done dumping result, returning 18823 1726855023.98302: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-d391-077c-00000000001c] 18823 1726855023.98305: sending task result for task 0affcc66-ac2b-d391-077c-00000000001c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855023.98437: no more pending results, returning what we have 18823 1726855023.98441: results queue empty 18823 1726855023.98442: checking for any_errors_fatal 18823 1726855023.98450: done checking for any_errors_fatal 
18823 1726855023.98451: checking for max_fail_percentage 18823 1726855023.98453: done checking for max_fail_percentage 18823 1726855023.98454: checking to see if all hosts have failed and the running result is not ok 18823 1726855023.98455: done checking to see if all hosts have failed 18823 1726855023.98455: getting the remaining hosts for this loop 18823 1726855023.98457: done getting the remaining hosts for this loop 18823 1726855023.98461: getting the next task for host managed_node2 18823 1726855023.98469: done getting next task for host managed_node2 18823 1726855023.98473: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18823 1726855023.98476: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855023.98493: getting variables 18823 1726855023.98495: in VariableManager get_vars() 18823 1726855023.98534: Calling all_inventory to load vars for managed_node2 18823 1726855023.98537: Calling groups_inventory to load vars for managed_node2 18823 1726855023.98539: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855023.98553: Calling all_plugins_play to load vars for managed_node2 18823 1726855023.98556: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855023.98559: Calling groups_plugins_play to load vars for managed_node2 18823 1726855023.99796: done sending task result for task 0affcc66-ac2b-d391-077c-00000000001c 18823 1726855023.99799: WORKER PROCESS EXITING 18823 1726855024.01478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855024.05519: done with get_vars() 18823 1726855024.05557: done getting variables 18823 1726855024.05625: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:57:04 -0400 (0:00:00.112) 0:00:15.708 ****** 18823 1726855024.05659: entering _queue_task() for managed_node2/fail 18823 1726855024.06407: worker is 1 (out of 1 available) 18823 1726855024.06422: exiting _queue_task() for managed_node2/fail 18823 1726855024.06436: done queuing things up, now waiting for results queue to drain 18823 1726855024.06437: waiting for pending results... 
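The teaming task queued here evaluates a `selectattr("type", "defined") | selectattr("type", "match", "^team$")` chain over both `network_connections` and `network_state.interfaces`. A hedged Python equivalent of that filter, with illustrative sample data (this mirrors the Jinja expression shown in the log, not Ansible internals):

```python
import re

# Python rendering of the teaming conditional: keep items whose `type`
# attribute is defined and matches ^team$; the task aborts on EL10+ only
# if any such item exists in the connections or the desired state.
def has_team_config(network_connections, network_state):
    def team_items(items):
        return [i for i in items
                if "type" in i and re.match(r"^team$", str(i["type"]))]
    return (len(team_items(network_connections)) > 0
            or len(team_items(network_state.get("interfaces", []))) > 0)

# With a plain ethernet connection and an empty state, the conditional is
# False, matching the "skipping" result for this task in the log.
skipped = not has_team_config([{"name": "eth0", "type": "ethernet"}], {})
```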
18823 1726855024.07137: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18823 1726855024.07368: in run() - task 0affcc66-ac2b-d391-077c-00000000001d 18823 1726855024.07614: variable 'ansible_search_path' from source: unknown 18823 1726855024.07706: variable 'ansible_search_path' from source: unknown 18823 1726855024.07754: calling self._execute() 18823 1726855024.07962: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855024.08077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855024.08095: variable 'omit' from source: magic vars 18823 1726855024.09494: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.09498: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855024.09907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855024.16661: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855024.16750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855024.17036: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855024.17493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855024.17497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855024.17500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.17505: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.18095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.18099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.18101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.18103: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.18105: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18823 1726855024.18592: variable 'ansible_distribution' from source: facts 18823 1726855024.18892: variable '__network_rh_distros' from source: role '' defaults 18823 1726855024.18895: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18823 1726855024.19069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.19322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.19352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 
1726855024.19400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.19429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.19480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.19720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.19750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.19795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.19819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.19862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.19893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18823 1726855024.20019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.20055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.20072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.20706: variable 'network_connections' from source: play vars 18823 1726855024.21093: variable 'interface' from source: set_fact 18823 1726855024.21096: variable 'interface' from source: set_fact 18823 1726855024.21098: variable 'interface' from source: set_fact 18823 1726855024.21101: variable 'interface' from source: set_fact 18823 1726855024.21106: variable 'network_state' from source: role '' defaults 18823 1726855024.21330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855024.21711: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855024.21769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855024.21810: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855024.21847: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855024.21930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855024.21966: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855024.22061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.22093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855024.22180: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18823 1726855024.22190: when evaluation is False, skipping this task 18823 1726855024.22256: _execute() done 18823 1726855024.22266: dumping result to json 18823 1726855024.22274: done dumping result, returning 18823 1726855024.22289: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-d391-077c-00000000001d] 18823 1726855024.22301: sending task result for task 0affcc66-ac2b-d391-077c-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18823 1726855024.22459: no more pending results, returning what we have 18823 1726855024.22463: results queue empty 18823 1726855024.22465: checking for 
any_errors_fatal 18823 1726855024.22470: done checking for any_errors_fatal 18823 1726855024.22471: checking for max_fail_percentage 18823 1726855024.22473: done checking for max_fail_percentage 18823 1726855024.22474: checking to see if all hosts have failed and the running result is not ok 18823 1726855024.22474: done checking to see if all hosts have failed 18823 1726855024.22475: getting the remaining hosts for this loop 18823 1726855024.22477: done getting the remaining hosts for this loop 18823 1726855024.22481: getting the next task for host managed_node2 18823 1726855024.22490: done getting next task for host managed_node2 18823 1726855024.22494: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18823 1726855024.22496: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855024.22512: getting variables 18823 1726855024.22514: in VariableManager get_vars() 18823 1726855024.22553: Calling all_inventory to load vars for managed_node2 18823 1726855024.22556: Calling groups_inventory to load vars for managed_node2 18823 1726855024.22558: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855024.22569: Calling all_plugins_play to load vars for managed_node2 18823 1726855024.22572: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855024.22576: Calling groups_plugins_play to load vars for managed_node2 18823 1726855024.23501: done sending task result for task 0affcc66-ac2b-d391-077c-00000000001d 18823 1726855024.23507: WORKER PROCESS EXITING 18823 1726855024.25901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855024.29089: done with get_vars() 18823 1726855024.29209: done getting variables 18823 1726855024.29426: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:57:04 -0400 (0:00:00.238) 0:00:15.946 ****** 18823 1726855024.29516: entering _queue_task() for managed_node2/dnf 18823 1726855024.30163: worker is 1 (out of 1 available) 18823 1726855024.30178: exiting _queue_task() for managed_node2/dnf 18823 1726855024.30396: done queuing things up, now waiting for results queue to drain 18823 1726855024.30397: waiting for pending results... 
18823 1726855024.30696: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18823 1726855024.31269: in run() - task 0affcc66-ac2b-d391-077c-00000000001e 18823 1726855024.31274: variable 'ansible_search_path' from source: unknown 18823 1726855024.31277: variable 'ansible_search_path' from source: unknown 18823 1726855024.31280: calling self._execute() 18823 1726855024.31441: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855024.31608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855024.31626: variable 'omit' from source: magic vars 18823 1726855024.32868: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.32888: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855024.33595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855024.40362: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855024.40624: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855024.40791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855024.40978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855024.41040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855024.41320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.41492: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.41621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.41812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.41839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.42380: variable 'ansible_distribution' from source: facts 18823 1726855024.42384: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.42386: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18823 1726855024.42812: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855024.43206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.43425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.43458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.43623: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.43645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.44107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.44111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.44114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.44117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.44294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.44297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.44300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 
1726855024.44647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.44651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.44654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.45163: variable 'network_connections' from source: play vars 18823 1726855024.45474: variable 'interface' from source: set_fact 18823 1726855024.45500: variable 'interface' from source: set_fact 18823 1726855024.45599: variable 'interface' from source: set_fact 18823 1726855024.45907: variable 'interface' from source: set_fact 18823 1726855024.46041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855024.46638: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855024.46793: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855024.47025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855024.47142: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855024.47231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855024.47385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855024.47436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.47610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855024.47840: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855024.48567: variable 'network_connections' from source: play vars 18823 1726855024.48994: variable 'interface' from source: set_fact 18823 1726855024.48997: variable 'interface' from source: set_fact 18823 1726855024.49000: variable 'interface' from source: set_fact 18823 1726855024.49136: variable 'interface' from source: set_fact 18823 1726855024.49175: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855024.49493: when evaluation is False, skipping this task 18823 1726855024.49497: _execute() done 18823 1726855024.49500: dumping result to json 18823 1726855024.49505: done dumping result, returning 18823 1726855024.49509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-00000000001e] 18823 1726855024.49511: sending task result for task 0affcc66-ac2b-d391-077c-00000000001e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18823 1726855024.49641: no more pending results, returning what we have 18823 1726855024.49645: results queue 
empty 18823 1726855024.49646: checking for any_errors_fatal 18823 1726855024.49654: done checking for any_errors_fatal 18823 1726855024.49655: checking for max_fail_percentage 18823 1726855024.49657: done checking for max_fail_percentage 18823 1726855024.49658: checking to see if all hosts have failed and the running result is not ok 18823 1726855024.49659: done checking to see if all hosts have failed 18823 1726855024.49659: getting the remaining hosts for this loop 18823 1726855024.49661: done getting the remaining hosts for this loop 18823 1726855024.49665: getting the next task for host managed_node2 18823 1726855024.49672: done getting next task for host managed_node2 18823 1726855024.49677: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18823 1726855024.49679: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855024.49695: getting variables 18823 1726855024.49697: in VariableManager get_vars() 18823 1726855024.49735: Calling all_inventory to load vars for managed_node2 18823 1726855024.49738: Calling groups_inventory to load vars for managed_node2 18823 1726855024.49740: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855024.49752: Calling all_plugins_play to load vars for managed_node2 18823 1726855024.49755: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855024.49758: Calling groups_plugins_play to load vars for managed_node2 18823 1726855024.50556: done sending task result for task 0affcc66-ac2b-d391-077c-00000000001e 18823 1726855024.50560: WORKER PROCESS EXITING 18823 1726855024.52680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855024.57085: done with get_vars() 18823 1726855024.57124: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18823 1726855024.57325: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:57:04 -0400 (0:00:00.278) 0:00:16.225 ****** 18823 1726855024.57355: entering _queue_task() for managed_node2/yum 18823 1726855024.57357: Creating lock for yum 18823 1726855024.58279: worker is 1 (out of 1 available) 18823 1726855024.58294: exiting _queue_task() for managed_node2/yum 18823 
1726855024.58305: done queuing things up, now waiting for results queue to drain 18823 1726855024.58306: waiting for pending results... 18823 1726855024.58688: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18823 1726855024.58902: in run() - task 0affcc66-ac2b-d391-077c-00000000001f 18823 1726855024.58982: variable 'ansible_search_path' from source: unknown 18823 1726855024.58994: variable 'ansible_search_path' from source: unknown 18823 1726855024.59044: calling self._execute() 18823 1726855024.59274: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855024.59293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855024.59309: variable 'omit' from source: magic vars 18823 1726855024.60154: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.60169: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855024.60591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855024.65545: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855024.66551: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855024.66861: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855024.66865: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855024.66867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855024.66924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.67002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.67109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.67169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.67312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.67457: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.67516: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18823 1726855024.67524: when evaluation is False, skipping this task 18823 1726855024.67532: _execute() done 18823 1726855024.67647: dumping result to json 18823 1726855024.67651: done dumping result, returning 18823 1726855024.67654: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-00000000001f] 18823 1726855024.67656: sending task result for task 0affcc66-ac2b-d391-077c-00000000001f skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18823 1726855024.67945: no more pending results, returning 
what we have 18823 1726855024.67948: results queue empty 18823 1726855024.67950: checking for any_errors_fatal 18823 1726855024.67954: done checking for any_errors_fatal 18823 1726855024.67955: checking for max_fail_percentage 18823 1726855024.67956: done checking for max_fail_percentage 18823 1726855024.67957: checking to see if all hosts have failed and the running result is not ok 18823 1726855024.67958: done checking to see if all hosts have failed 18823 1726855024.67959: getting the remaining hosts for this loop 18823 1726855024.67960: done getting the remaining hosts for this loop 18823 1726855024.67965: getting the next task for host managed_node2 18823 1726855024.67973: done getting next task for host managed_node2 18823 1726855024.67977: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18823 1726855024.67979: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855024.67998: getting variables 18823 1726855024.68000: in VariableManager get_vars() 18823 1726855024.68040: Calling all_inventory to load vars for managed_node2 18823 1726855024.68043: Calling groups_inventory to load vars for managed_node2 18823 1726855024.68046: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855024.68058: Calling all_plugins_play to load vars for managed_node2 18823 1726855024.68061: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855024.68064: Calling groups_plugins_play to load vars for managed_node2 18823 1726855024.68602: done sending task result for task 0affcc66-ac2b-d391-077c-00000000001f 18823 1726855024.68606: WORKER PROCESS EXITING 18823 1726855024.80182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855024.83469: done with get_vars() 18823 1726855024.83505: done getting variables 18823 1726855024.83676: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:57:04 -0400 (0:00:00.263) 0:00:16.488 ****** 18823 1726855024.83707: entering _queue_task() for managed_node2/fail 18823 1726855024.84360: worker is 1 (out of 1 available) 18823 1726855024.84374: exiting _queue_task() for managed_node2/fail 18823 1726855024.84386: done queuing things up, now waiting for results queue to drain 18823 1726855024.84645: waiting for pending results... 
18823 1726855024.84935: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18823 1726855024.85231: in run() - task 0affcc66-ac2b-d391-077c-000000000020 18823 1726855024.85235: variable 'ansible_search_path' from source: unknown 18823 1726855024.85242: variable 'ansible_search_path' from source: unknown 18823 1726855024.85285: calling self._execute() 18823 1726855024.85546: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855024.85665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855024.85669: variable 'omit' from source: magic vars 18823 1726855024.86414: variable 'ansible_distribution_major_version' from source: facts 18823 1726855024.86434: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855024.86755: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855024.87081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855024.91734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855024.91792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855024.92096: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855024.92100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855024.92102: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855024.92185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18823 1726855024.92344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.92380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.92472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.92494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.92599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.92663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.92707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.92779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.92873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.92924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855024.92991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855024.93101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.93145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855024.93197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855024.93655: variable 'network_connections' from source: play vars 18823 1726855024.93658: variable 'interface' from source: set_fact 18823 1726855024.93776: variable 'interface' from source: set_fact 18823 1726855024.93802: variable 'interface' from source: set_fact 18823 1726855024.93963: variable 'interface' from source: set_fact 18823 1726855024.94115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855024.94611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855024.94615: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855024.94618: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855024.94793: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855024.94797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855024.94902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855024.94939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855024.94977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855024.95114: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855024.95824: variable 'network_connections' from source: play vars 18823 1726855024.95828: variable 'interface' from source: set_fact 18823 1726855024.95831: variable 'interface' from source: set_fact 18823 1726855024.95833: variable 'interface' from source: set_fact 18823 1726855024.95952: variable 'interface' from source: set_fact 18823 1726855024.95989: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855024.96046: when evaluation is False, skipping this task 18823 1726855024.96055: _execute() done 18823 1726855024.96063: dumping result to json 18823 1726855024.96071: done dumping result, returning 18823 1726855024.96082: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000020] 18823 1726855024.96103: sending task result for task 0affcc66-ac2b-d391-077c-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18823 1726855024.96307: no more pending results, returning what we have 18823 1726855024.96312: results queue empty 18823 1726855024.96313: checking for any_errors_fatal 18823 1726855024.96322: done checking for any_errors_fatal 18823 1726855024.96323: checking for max_fail_percentage 18823 1726855024.96326: done checking for max_fail_percentage 18823 1726855024.96327: checking to see if all hosts have failed and the running result is not ok 18823 1726855024.96328: done checking to see if all hosts have failed 18823 1726855024.96329: getting the remaining hosts for this loop 18823 1726855024.96330: done getting the remaining hosts for this loop 18823 1726855024.96334: getting the next task for host managed_node2 18823 1726855024.96342: done getting next task for host managed_node2 18823 1726855024.96346: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18823 1726855024.96348: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855024.96363: getting variables 18823 1726855024.96365: in VariableManager get_vars() 18823 1726855024.96405: Calling all_inventory to load vars for managed_node2 18823 1726855024.96408: Calling groups_inventory to load vars for managed_node2 18823 1726855024.96411: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855024.96422: Calling all_plugins_play to load vars for managed_node2 18823 1726855024.96425: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855024.96428: Calling groups_plugins_play to load vars for managed_node2 18823 1726855024.97358: done sending task result for task 0affcc66-ac2b-d391-077c-000000000020 18823 1726855024.97362: WORKER PROCESS EXITING 18823 1726855024.99651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855025.03263: done with get_vars() 18823 1726855025.03294: done getting variables 18823 1726855025.03378: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:57:05 -0400 (0:00:00.197) 0:00:16.685 ****** 18823 1726855025.03420: entering _queue_task() for managed_node2/package 18823 1726855025.03781: worker is 1 (out of 1 available) 18823 1726855025.03796: exiting _queue_task() for managed_node2/package 18823 1726855025.03809: done queuing things up, now waiting for results queue to drain 18823 1726855025.03810: waiting for pending results... 
18823 1726855025.04092: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 18823 1726855025.04212: in run() - task 0affcc66-ac2b-d391-077c-000000000021 18823 1726855025.04310: variable 'ansible_search_path' from source: unknown 18823 1726855025.04317: variable 'ansible_search_path' from source: unknown 18823 1726855025.04355: calling self._execute() 18823 1726855025.04583: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.04597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.04620: variable 'omit' from source: magic vars 18823 1726855025.04995: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.05079: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855025.05409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855025.06319: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855025.06382: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855025.06533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855025.06731: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855025.07045: variable 'network_packages' from source: role '' defaults 18823 1726855025.07295: variable '__network_provider_setup' from source: role '' defaults 18823 1726855025.07320: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855025.07526: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855025.07633: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855025.07664: variable 
'__network_packages_default_nm' from source: role '' defaults 18823 1726855025.07865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855025.10371: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855025.10449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855025.10500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855025.10563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855025.10572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855025.10659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.10704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.10777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.10802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.10889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 
1726855025.10893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.10928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.10995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.11029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.11062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.11485: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18823 1726855025.11796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.11800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.11803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.11805: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.11807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.11879: variable 'ansible_python' from source: facts 18823 1726855025.11913: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18823 1726855025.11994: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855025.12077: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855025.12210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.12238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.12262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.12302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.12320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.12368: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.12392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.12419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.12461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.12475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.12657: variable 'network_connections' from source: play vars 18823 1726855025.12665: variable 'interface' from source: set_fact 18823 1726855025.12770: variable 'interface' from source: set_fact 18823 1726855025.12776: variable 'interface' from source: set_fact 18823 1726855025.12901: variable 'interface' from source: set_fact 18823 1726855025.13170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855025.13225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855025.13250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.13276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855025.13629: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855025.14430: variable 'network_connections' from source: play vars 18823 1726855025.14434: variable 'interface' from source: set_fact 18823 1726855025.14690: variable 'interface' from source: set_fact 18823 1726855025.14700: variable 'interface' from source: set_fact 18823 1726855025.14968: variable 'interface' from source: set_fact 18823 1726855025.15223: variable '__network_packages_default_wireless' from source: role '' defaults 18823 1726855025.15226: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855025.15947: variable 'network_connections' from source: play vars 18823 1726855025.15950: variable 'interface' from source: set_fact 18823 1726855025.16065: variable 'interface' from source: set_fact 18823 1726855025.16077: variable 'interface' from source: set_fact 18823 1726855025.16143: variable 'interface' from source: set_fact 18823 1726855025.16168: variable '__network_packages_default_team' from source: role '' defaults 18823 1726855025.16276: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855025.16850: variable 'network_connections' from source: play vars 18823 1726855025.16853: variable 'interface' from source: set_fact 18823 1726855025.17083: variable 'interface' from source: set_fact 18823 1726855025.17093: variable 'interface' from source: set_fact 18823 1726855025.17150: variable 'interface' from source: set_fact 18823 1726855025.17330: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 
1726855025.17384: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855025.17419: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855025.17474: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855025.17953: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18823 1726855025.19394: variable 'network_connections' from source: play vars 18823 1726855025.19398: variable 'interface' from source: set_fact 18823 1726855025.19400: variable 'interface' from source: set_fact 18823 1726855025.19402: variable 'interface' from source: set_fact 18823 1726855025.19404: variable 'interface' from source: set_fact 18823 1726855025.19405: variable 'ansible_distribution' from source: facts 18823 1726855025.19407: variable '__network_rh_distros' from source: role '' defaults 18823 1726855025.19409: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.19635: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18823 1726855025.19692: variable 'ansible_distribution' from source: facts 18823 1726855025.19760: variable '__network_rh_distros' from source: role '' defaults 18823 1726855025.19763: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.19778: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18823 1726855025.20274: variable 'ansible_distribution' from source: facts 18823 1726855025.20277: variable '__network_rh_distros' from source: role '' defaults 18823 1726855025.20279: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.20281: variable 'network_provider' from source: set_fact 18823 1726855025.20283: variable 'ansible_facts' from source: unknown 18823 1726855025.21148: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 18823 1726855025.21152: when evaluation is False, skipping this task 18823 1726855025.21156: _execute() done 18823 1726855025.21192: dumping result to json 18823 1726855025.21195: done dumping result, returning 18823 1726855025.21201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-d391-077c-000000000021] 18823 1726855025.21204: sending task result for task 0affcc66-ac2b-d391-077c-000000000021 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18823 1726855025.21444: no more pending results, returning what we have 18823 1726855025.21448: results queue empty 18823 1726855025.21449: checking for any_errors_fatal 18823 1726855025.21454: done checking for any_errors_fatal 18823 1726855025.21454: checking for max_fail_percentage 18823 1726855025.21456: done checking for max_fail_percentage 18823 1726855025.21457: checking to see if all hosts have failed and the running result is not ok 18823 1726855025.21459: done checking to see if all hosts have failed 18823 1726855025.21460: getting the remaining hosts for this loop 18823 1726855025.21461: done getting the remaining hosts for this loop 18823 1726855025.21464: getting the next task for host managed_node2 18823 1726855025.21470: done getting next task for host managed_node2 18823 1726855025.21473: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18823 1726855025.21475: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855025.21492: getting variables 18823 1726855025.21493: in VariableManager get_vars() 18823 1726855025.21526: Calling all_inventory to load vars for managed_node2 18823 1726855025.21528: Calling groups_inventory to load vars for managed_node2 18823 1726855025.21530: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855025.21539: Calling all_plugins_play to load vars for managed_node2 18823 1726855025.21545: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855025.21548: Calling groups_plugins_play to load vars for managed_node2 18823 1726855025.22295: done sending task result for task 0affcc66-ac2b-d391-077c-000000000021 18823 1726855025.22298: WORKER PROCESS EXITING 18823 1726855025.24850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855025.26521: done with get_vars() 18823 1726855025.26546: done getting variables 18823 1726855025.26614: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:57:05 -0400 (0:00:00.232) 0:00:16.918 ****** 18823 1726855025.26645: entering _queue_task() for managed_node2/package 18823 1726855025.27006: worker is 1 (out of 1 available) 18823 1726855025.27019: exiting _queue_task() for managed_node2/package 18823 1726855025.27177: done queuing things up, now waiting for results queue to drain 18823 1726855025.27179: waiting for pending results... 
18823 1726855025.27679: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18823 1726855025.28116: in run() - task 0affcc66-ac2b-d391-077c-000000000022 18823 1726855025.28120: variable 'ansible_search_path' from source: unknown 18823 1726855025.28123: variable 'ansible_search_path' from source: unknown 18823 1726855025.28126: calling self._execute() 18823 1726855025.28268: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.28342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.28357: variable 'omit' from source: magic vars 18823 1726855025.29083: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.29109: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855025.29263: variable 'network_state' from source: role '' defaults 18823 1726855025.29313: Evaluated conditional (network_state != {}): False 18823 1726855025.29324: when evaluation is False, skipping this task 18823 1726855025.29332: _execute() done 18823 1726855025.29339: dumping result to json 18823 1726855025.29348: done dumping result, returning 18823 1726855025.29361: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-d391-077c-000000000022] 18823 1726855025.29373: sending task result for task 0affcc66-ac2b-d391-077c-000000000022 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855025.29552: no more pending results, returning what we have 18823 1726855025.29557: results queue empty 18823 1726855025.29558: checking for any_errors_fatal 18823 1726855025.29569: done checking for any_errors_fatal 18823 1726855025.29570: checking for max_fail_percentage 18823 
1726855025.29572: done checking for max_fail_percentage 18823 1726855025.29573: checking to see if all hosts have failed and the running result is not ok 18823 1726855025.29573: done checking to see if all hosts have failed 18823 1726855025.29574: getting the remaining hosts for this loop 18823 1726855025.29576: done getting the remaining hosts for this loop 18823 1726855025.29580: getting the next task for host managed_node2 18823 1726855025.29589: done getting next task for host managed_node2 18823 1726855025.29593: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18823 1726855025.29595: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855025.29611: getting variables 18823 1726855025.29613: in VariableManager get_vars() 18823 1726855025.29654: Calling all_inventory to load vars for managed_node2 18823 1726855025.29657: Calling groups_inventory to load vars for managed_node2 18823 1726855025.29660: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855025.29809: done sending task result for task 0affcc66-ac2b-d391-077c-000000000022 18823 1726855025.29812: WORKER PROCESS EXITING 18823 1726855025.29901: Calling all_plugins_play to load vars for managed_node2 18823 1726855025.29910: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855025.29915: Calling groups_plugins_play to load vars for managed_node2 18823 1726855025.31391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855025.33537: done with get_vars() 18823 1726855025.33559: done getting variables 18823 1726855025.33677: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:57:05 -0400 (0:00:00.070) 0:00:16.988 ****** 18823 1726855025.33731: entering _queue_task() for managed_node2/package 18823 1726855025.34240: worker is 1 (out of 1 available) 18823 1726855025.34256: exiting _queue_task() for managed_node2/package 18823 1726855025.34269: done queuing things up, now waiting for results queue to drain 18823 1726855025.34270: waiting for pending results... 18823 1726855025.34531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18823 1726855025.34857: in run() - task 0affcc66-ac2b-d391-077c-000000000023 18823 1726855025.34861: variable 'ansible_search_path' from source: unknown 18823 1726855025.34865: variable 'ansible_search_path' from source: unknown 18823 1726855025.34870: calling self._execute() 18823 1726855025.35023: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.35041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.35147: variable 'omit' from source: magic vars 18823 1726855025.35708: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.35723: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855025.35871: variable 'network_state' from source: role '' defaults 18823 1726855025.35892: Evaluated conditional (network_state != {}): False 18823 1726855025.35909: when evaluation is False, 
skipping this task 18823 1726855025.35934: _execute() done 18823 1726855025.35942: dumping result to json 18823 1726855025.35950: done dumping result, returning 18823 1726855025.35993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-d391-077c-000000000023] 18823 1726855025.35996: sending task result for task 0affcc66-ac2b-d391-077c-000000000023 18823 1726855025.36350: done sending task result for task 0affcc66-ac2b-d391-077c-000000000023 18823 1726855025.36353: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855025.36414: no more pending results, returning what we have 18823 1726855025.36419: results queue empty 18823 1726855025.36420: checking for any_errors_fatal 18823 1726855025.36428: done checking for any_errors_fatal 18823 1726855025.36429: checking for max_fail_percentage 18823 1726855025.36431: done checking for max_fail_percentage 18823 1726855025.36432: checking to see if all hosts have failed and the running result is not ok 18823 1726855025.36433: done checking to see if all hosts have failed 18823 1726855025.36433: getting the remaining hosts for this loop 18823 1726855025.36435: done getting the remaining hosts for this loop 18823 1726855025.36441: getting the next task for host managed_node2 18823 1726855025.36449: done getting next task for host managed_node2 18823 1726855025.36452: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18823 1726855025.36455: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855025.36470: getting variables 18823 1726855025.36473: in VariableManager get_vars() 18823 1726855025.36517: Calling all_inventory to load vars for managed_node2 18823 1726855025.36522: Calling groups_inventory to load vars for managed_node2 18823 1726855025.36526: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855025.36539: Calling all_plugins_play to load vars for managed_node2 18823 1726855025.36546: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855025.36550: Calling groups_plugins_play to load vars for managed_node2 18823 1726855025.39256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855025.40889: done with get_vars() 18823 1726855025.40914: done getting variables 18823 1726855025.41028: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:57:05 -0400 (0:00:00.073) 0:00:17.062 ****** 18823 1726855025.41059: entering _queue_task() for managed_node2/service 18823 1726855025.41061: Creating lock for service 18823 1726855025.41608: worker is 1 (out of 1 available) 18823 1726855025.41618: exiting _queue_task() for managed_node2/service 18823 1726855025.41628: done queuing things up, now waiting for results queue to drain 18823 1726855025.41629: waiting for pending results... 
18823 1726855025.41715: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18823 1726855025.41857: in run() - task 0affcc66-ac2b-d391-077c-000000000024 18823 1726855025.41861: variable 'ansible_search_path' from source: unknown 18823 1726855025.41864: variable 'ansible_search_path' from source: unknown 18823 1726855025.41895: calling self._execute() 18823 1726855025.41996: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.42009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.42023: variable 'omit' from source: magic vars 18823 1726855025.42514: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.42518: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855025.42592: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855025.42808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855025.46696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855025.46799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855025.46848: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855025.46933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855025.47029: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855025.47294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18823 1726855025.47298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.47300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.47305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.47308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.47351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.47452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.47512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.47568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.47590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.47645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.47694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.47720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.47856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.47859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.48015: variable 'network_connections' from source: play vars 18823 1726855025.48034: variable 'interface' from source: set_fact 18823 1726855025.48127: variable 'interface' from source: set_fact 18823 1726855025.48144: variable 'interface' from source: set_fact 18823 1726855025.48223: variable 'interface' from source: set_fact 18823 1726855025.48309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855025.49079: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855025.49131: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855025.49223: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855025.49333: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855025.49513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855025.49516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855025.49518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.49520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855025.49544: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855025.50139: variable 'network_connections' from source: play vars 18823 1726855025.50247: variable 'interface' from source: set_fact 18823 1726855025.50345: variable 'interface' from source: set_fact 18823 1726855025.50356: variable 'interface' from source: set_fact 18823 1726855025.50427: variable 'interface' from source: set_fact 18823 1726855025.50460: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855025.50466: when evaluation is False, skipping this task 18823 1726855025.50472: _execute() done 18823 1726855025.50478: dumping result to json 18823 1726855025.50483: done dumping result, returning 18823 1726855025.50496: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000024] 18823 1726855025.50521: sending task result for task 0affcc66-ac2b-d391-077c-000000000024 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18823 1726855025.50753: no more pending results, returning what we have 18823 1726855025.50757: results queue empty 18823 1726855025.50758: checking for any_errors_fatal 18823 1726855025.50770: done checking for any_errors_fatal 18823 1726855025.50771: checking for max_fail_percentage 18823 1726855025.50774: done checking for max_fail_percentage 18823 1726855025.50775: checking to see if all hosts have failed and the running result is not ok 18823 1726855025.50776: done checking to see if all hosts have failed 18823 1726855025.50776: getting the remaining hosts for this loop 18823 1726855025.50778: done getting the remaining hosts for this loop 18823 1726855025.50783: getting the next task for host managed_node2 18823 1726855025.50792: done getting next task for host managed_node2 18823 1726855025.50796: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18823 1726855025.50798: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855025.50815: getting variables 18823 1726855025.50818: in VariableManager get_vars() 18823 1726855025.50861: Calling all_inventory to load vars for managed_node2 18823 1726855025.50864: Calling groups_inventory to load vars for managed_node2 18823 1726855025.50867: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855025.50879: Calling all_plugins_play to load vars for managed_node2 18823 1726855025.50882: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855025.50886: Calling groups_plugins_play to load vars for managed_node2 18823 1726855025.51729: done sending task result for task 0affcc66-ac2b-d391-077c-000000000024 18823 1726855025.51733: WORKER PROCESS EXITING 18823 1726855025.53132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855025.55479: done with get_vars() 18823 1726855025.55516: done getting variables 18823 1726855025.55589: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:57:05 -0400 (0:00:00.145) 0:00:17.207 ****** 18823 1726855025.55624: entering _queue_task() for managed_node2/service 18823 1726855025.56132: worker is 1 (out of 1 available) 18823 1726855025.56148: exiting _queue_task() for managed_node2/service 18823 1726855025.56160: done queuing things up, now waiting for results queue to drain 18823 1726855025.56160: waiting for pending results... 
18823 1726855025.56462: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18823 1726855025.56618: in run() - task 0affcc66-ac2b-d391-077c-000000000025 18823 1726855025.56653: variable 'ansible_search_path' from source: unknown 18823 1726855025.56714: variable 'ansible_search_path' from source: unknown 18823 1726855025.56745: calling self._execute() 18823 1726855025.56968: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.56979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.57020: variable 'omit' from source: magic vars 18823 1726855025.57473: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.57494: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855025.57670: variable 'network_provider' from source: set_fact 18823 1726855025.57673: variable 'network_state' from source: role '' defaults 18823 1726855025.57781: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18823 1726855025.57784: variable 'omit' from source: magic vars 18823 1726855025.57786: variable 'omit' from source: magic vars 18823 1726855025.57790: variable 'network_service_name' from source: role '' defaults 18823 1726855025.57907: variable 'network_service_name' from source: role '' defaults 18823 1726855025.58146: variable '__network_provider_setup' from source: role '' defaults 18823 1726855025.58157: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855025.58233: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855025.58253: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855025.58353: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855025.58722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18823 1726855025.61744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855025.61891: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855025.62027: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855025.62030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855025.62041: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855025.62163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.62215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.62313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.62460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.62464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.62479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18823 1726855025.62514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.62542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.62592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.62620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.62938: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18823 1726855025.63116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.63162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.63224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.63247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.63272: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.63395: variable 'ansible_python' from source: facts 18823 1726855025.63475: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18823 1726855025.63554: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855025.63667: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855025.63907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.63913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.63918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.63979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.64009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.64107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855025.64147: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855025.64181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.64226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855025.64252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855025.64470: variable 'network_connections' from source: play vars 18823 1726855025.64475: variable 'interface' from source: set_fact 18823 1726855025.64520: variable 'interface' from source: set_fact 18823 1726855025.64536: variable 'interface' from source: set_fact 18823 1726855025.64634: variable 'interface' from source: set_fact 18823 1726855025.64781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855025.65066: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855025.65212: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855025.65366: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855025.65369: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855025.65459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855025.65499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855025.65584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855025.65668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855025.65797: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855025.66400: variable 'network_connections' from source: play vars 18823 1726855025.66403: variable 'interface' from source: set_fact 18823 1726855025.66434: variable 'interface' from source: set_fact 18823 1726855025.66450: variable 'interface' from source: set_fact 18823 1726855025.66645: variable 'interface' from source: set_fact 18823 1726855025.66804: variable '__network_packages_default_wireless' from source: role '' defaults 18823 1726855025.67020: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855025.67795: variable 'network_connections' from source: play vars 18823 1726855025.67819: variable 'interface' from source: set_fact 18823 1726855025.67893: variable 'interface' from source: set_fact 18823 1726855025.67904: variable 'interface' from source: set_fact 18823 1726855025.68007: variable 'interface' from source: set_fact 18823 1726855025.68044: variable '__network_packages_default_team' from source: role '' defaults 18823 1726855025.68125: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855025.68449: variable 
'network_connections' from source: play vars 18823 1726855025.68499: variable 'interface' from source: set_fact 18823 1726855025.68576: variable 'interface' from source: set_fact 18823 1726855025.68671: variable 'interface' from source: set_fact 18823 1726855025.68770: variable 'interface' from source: set_fact 18823 1726855025.68836: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855025.68975: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855025.68986: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855025.69081: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855025.69437: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18823 1726855025.69981: variable 'network_connections' from source: play vars 18823 1726855025.69999: variable 'interface' from source: set_fact 18823 1726855025.70060: variable 'interface' from source: set_fact 18823 1726855025.70072: variable 'interface' from source: set_fact 18823 1726855025.70160: variable 'interface' from source: set_fact 18823 1726855025.70180: variable 'ansible_distribution' from source: facts 18823 1726855025.70203: variable '__network_rh_distros' from source: role '' defaults 18823 1726855025.70294: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.70297: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18823 1726855025.70683: variable 'ansible_distribution' from source: facts 18823 1726855025.70719: variable '__network_rh_distros' from source: role '' defaults 18823 1726855025.71096: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.71099: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18823 1726855025.71298: variable 'ansible_distribution' from source: 
facts 18823 1726855025.71329: variable '__network_rh_distros' from source: role '' defaults 18823 1726855025.71344: variable 'ansible_distribution_major_version' from source: facts 18823 1726855025.71398: variable 'network_provider' from source: set_fact 18823 1726855025.71565: variable 'omit' from source: magic vars 18823 1726855025.71613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855025.71647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855025.72009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855025.72012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855025.72015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855025.72018: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855025.72020: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.72022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.72110: Set connection var ansible_timeout to 10 18823 1726855025.72136: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855025.72139: Set connection var ansible_shell_type to sh 18823 1726855025.72143: Set connection var ansible_shell_executable to /bin/sh 18823 1726855025.72151: Set connection var ansible_connection to ssh 18823 1726855025.72159: Set connection var ansible_pipelining to False 18823 1726855025.72299: variable 'ansible_shell_executable' from source: unknown 18823 1726855025.72302: variable 'ansible_connection' from source: unknown 18823 1726855025.72306: variable 'ansible_module_compression' from source: unknown 18823 1726855025.72312: 
variable 'ansible_shell_type' from source: unknown 18823 1726855025.72317: variable 'ansible_shell_executable' from source: unknown 18823 1726855025.72322: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855025.72338: variable 'ansible_pipelining' from source: unknown 18823 1726855025.72343: variable 'ansible_timeout' from source: unknown 18823 1726855025.72662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855025.72666: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855025.72668: variable 'omit' from source: magic vars 18823 1726855025.72670: starting attempt loop 18823 1726855025.72673: running the handler 18823 1726855025.72828: variable 'ansible_facts' from source: unknown 18823 1726855025.74521: _low_level_execute_command(): starting 18823 1726855025.74605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855025.76018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855025.76089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855025.76129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855025.76208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855025.77926: stdout chunk (state=3): >>>/root <<< 18823 1726855025.78197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855025.78201: stdout chunk (state=3): >>><<< 18823 1726855025.78282: stderr chunk (state=3): >>><<< 18823 1726855025.78285: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 18823 1726855025.78293: _low_level_execute_command(): starting 18823 1726855025.78297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825 `" && echo ansible-tmp-1726855025.7823522-19673-246755000002825="` echo /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825 `" ) && sleep 0' 18823 1726855025.79476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855025.79496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855025.79509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855025.79608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855025.79612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855025.79676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855025.79679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855025.79709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 18823 1726855025.79902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855025.81739: stdout chunk (state=3): >>>ansible-tmp-1726855025.7823522-19673-246755000002825=/root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825 <<< 18823 1726855025.81883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855025.81888: stdout chunk (state=3): >>><<< 18823 1726855025.81897: stderr chunk (state=3): >>><<< 18823 1726855025.81982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855025.7823522-19673-246755000002825=/root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855025.82021: variable 'ansible_module_compression' from source: unknown 18823 
1726855025.82203: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 18823 1726855025.82206: ANSIBALLZ: Acquiring lock 18823 1726855025.82212: ANSIBALLZ: Lock acquired: 140142269228544 18823 1726855025.82217: ANSIBALLZ: Creating module 18823 1726855026.26340: ANSIBALLZ: Writing module into payload 18823 1726855026.26759: ANSIBALLZ: Writing module 18823 1726855026.26767: ANSIBALLZ: Renaming module 18823 1726855026.26770: ANSIBALLZ: Done creating module 18823 1726855026.26952: variable 'ansible_facts' from source: unknown 18823 1726855026.27599: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py 18823 1726855026.27644: Sending initial data 18823 1726855026.27648: Sent initial data (156 bytes) 18823 1726855026.28991: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855026.28995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855026.28998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855026.29001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855026.29006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855026.29009: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855026.29012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855026.29014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855026.29017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855026.29019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855026.29022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 
1726855026.29024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855026.29027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855026.29029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855026.29032: stderr chunk (state=3): >>>debug2: match found <<< 18823 1726855026.29034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855026.29151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855026.29177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855026.29395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855026.31212: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18823 1726855026.31234: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855026.31365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855026.31441: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpksjuuf66 /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py <<< 18823 1726855026.31447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py" <<< 18823 1726855026.31518: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpksjuuf66" to remote "/root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py" <<< 18823 1726855026.35218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855026.35222: stdout chunk (state=3): >>><<< 18823 1726855026.35229: stderr chunk (state=3): >>><<< 18823 1726855026.35275: done transferring module to remote 18823 1726855026.35295: _low_level_execute_command(): starting 18823 1726855026.35405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/ /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py && sleep 0' 18823 1726855026.36667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855026.36704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855026.36718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855026.36732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855026.37033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855026.37229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855026.37303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855026.39102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855026.39133: stderr chunk (state=3): >>><<< 18823 1726855026.39144: stdout chunk (state=3): >>><<< 18823 1726855026.39180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855026.39183: _low_level_execute_command(): starting 18823 1726855026.39190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/AnsiballZ_systemd.py && sleep 0' 18823 1726855026.40334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855026.40428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855026.40431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855026.40434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855026.40454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855026.40457: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855026.40467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855026.40518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855026.40526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855026.40529: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855026.40550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855026.40632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 
1726855026.40635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855026.40637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855026.40639: stderr chunk (state=3): >>>debug2: match found <<< 18823 1726855026.40641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855026.40751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855026.41192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855026.41235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855026.41317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855026.70445: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317030912", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1132017000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "netw<<< 18823 1726855026.70523: stdout chunk (state=3): >>>ork-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18823 1726855026.72448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855026.72452: stdout chunk (state=3): >>><<< 18823 1726855026.72454: stderr chunk (state=3): >>><<< 18823 1726855026.72561: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317030912", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1132017000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", 
"ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855026.72847: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855026.72870: _low_level_execute_command(): starting 18823 1726855026.72882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855025.7823522-19673-246755000002825/ > /dev/null 2>&1 && sleep 0' 18823 1726855026.73448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855026.73534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855026.73537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855026.73540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855026.73542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855026.73544: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855026.73546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855026.73584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855026.73597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855026.73608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855026.73696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855026.75649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855026.75652: stderr chunk (state=3): >>><<< 18823 1726855026.75655: stdout chunk (state=3): >>><<< 18823 1726855026.75657: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855026.75659: handler run complete 18823 1726855026.75920: attempt loop complete, returning result 18823 1726855026.75923: _execute() done 18823 1726855026.75925: dumping result to json 18823 1726855026.75927: done dumping result, returning 18823 1726855026.75929: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-d391-077c-000000000025] 18823 1726855026.75932: sending task result for task 0affcc66-ac2b-d391-077c-000000000025 18823 1726855026.78113: done sending task result for task 0affcc66-ac2b-d391-077c-000000000025 18823 1726855026.78117: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855026.78176: no more pending results, returning what we have 18823 1726855026.78183: results queue empty 18823 1726855026.78184: checking for any_errors_fatal 18823 1726855026.78192: done checking for any_errors_fatal 18823 1726855026.78193: checking for max_fail_percentage 18823 1726855026.78195: done checking for max_fail_percentage 18823 1726855026.78198: checking to see if all hosts have failed and the running result is not ok 18823 1726855026.78199: done checking to see if all hosts have failed 18823 1726855026.78199: getting the remaining hosts for this loop 18823 1726855026.78201: done getting the remaining hosts for this loop 18823 1726855026.78205: getting the next task for host managed_node2 18823 1726855026.78212: done getting next task for host managed_node2 18823 1726855026.78217: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 18823 1726855026.78220: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855026.78231: getting variables 18823 1726855026.78233: in VariableManager get_vars() 18823 1726855026.78266: Calling all_inventory to load vars for managed_node2 18823 1726855026.78269: Calling groups_inventory to load vars for managed_node2 18823 1726855026.78272: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855026.78286: Calling all_plugins_play to load vars for managed_node2 18823 1726855026.78407: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855026.78413: Calling groups_plugins_play to load vars for managed_node2 18823 1726855026.79932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855026.81977: done with get_vars() 18823 1726855026.82002: done getting variables 18823 1726855026.82070: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:57:06 -0400 (0:00:01.264) 0:00:18.472 ****** 18823 1726855026.82112: entering _queue_task() for managed_node2/service 18823 1726855026.82462: worker is 1 (out of 1 available) 18823 1726855026.82594: exiting _queue_task() for managed_node2/service 18823 1726855026.82607: done queuing things 
up, now waiting for results queue to drain 18823 1726855026.82608: waiting for pending results... 18823 1726855026.82885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18823 1726855026.82922: in run() - task 0affcc66-ac2b-d391-077c-000000000026 18823 1726855026.82935: variable 'ansible_search_path' from source: unknown 18823 1726855026.82940: variable 'ansible_search_path' from source: unknown 18823 1726855026.83194: calling self._execute() 18823 1726855026.83198: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855026.83201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855026.83204: variable 'omit' from source: magic vars 18823 1726855026.83540: variable 'ansible_distribution_major_version' from source: facts 18823 1726855026.83555: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855026.83678: variable 'network_provider' from source: set_fact 18823 1726855026.83682: Evaluated conditional (network_provider == "nm"): True 18823 1726855026.83791: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855026.83878: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855026.84059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855026.86710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855026.86755: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855026.86790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855026.86831: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855026.86857: 
Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855026.86995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855026.86999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855026.87001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855026.87052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855026.87076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855026.87144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855026.87159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855026.87184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855026.87231: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855026.87249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855026.87294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855026.87322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855026.87358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855026.87614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855026.87617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855026.87620: variable 'network_connections' from source: play vars 18823 1726855026.87622: variable 'interface' from source: set_fact 18823 1726855026.87660: variable 'interface' from source: set_fact 18823 1726855026.87669: variable 'interface' from source: set_fact 18823 1726855026.87746: variable 'interface' from source: set_fact 18823 1726855026.87855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
18823 1726855026.88067: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855026.88111: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855026.88151: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855026.88179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855026.88223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855026.88255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855026.88282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855026.88311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855026.88364: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855026.88695: variable 'network_connections' from source: play vars 18823 1726855026.88700: variable 'interface' from source: set_fact 18823 1726855026.88702: variable 'interface' from source: set_fact 18823 1726855026.88704: variable 'interface' from source: set_fact 18823 1726855026.88812: variable 'interface' from source: set_fact 18823 1726855026.88815: Evaluated conditional (__network_wpa_supplicant_required): False 18823 1726855026.88818: when evaluation is False, skipping this task 18823 1726855026.88820: 
_execute() done 18823 1726855026.88832: dumping result to json 18823 1726855026.88834: done dumping result, returning 18823 1726855026.88836: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-d391-077c-000000000026] 18823 1726855026.88838: sending task result for task 0affcc66-ac2b-d391-077c-000000000026 18823 1726855026.89081: done sending task result for task 0affcc66-ac2b-d391-077c-000000000026 18823 1726855026.89083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18823 1726855026.89138: no more pending results, returning what we have 18823 1726855026.89141: results queue empty 18823 1726855026.89142: checking for any_errors_fatal 18823 1726855026.89166: done checking for any_errors_fatal 18823 1726855026.89167: checking for max_fail_percentage 18823 1726855026.89169: done checking for max_fail_percentage 18823 1726855026.89169: checking to see if all hosts have failed and the running result is not ok 18823 1726855026.89170: done checking to see if all hosts have failed 18823 1726855026.89171: getting the remaining hosts for this loop 18823 1726855026.89173: done getting the remaining hosts for this loop 18823 1726855026.89178: getting the next task for host managed_node2 18823 1726855026.89191: done getting next task for host managed_node2 18823 1726855026.89195: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18823 1726855026.89197: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855026.89215: getting variables 18823 1726855026.89217: in VariableManager get_vars() 18823 1726855026.89263: Calling all_inventory to load vars for managed_node2 18823 1726855026.89266: Calling groups_inventory to load vars for managed_node2 18823 1726855026.89268: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855026.89278: Calling all_plugins_play to load vars for managed_node2 18823 1726855026.89280: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855026.89283: Calling groups_plugins_play to load vars for managed_node2 18823 1726855026.90990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855026.92923: done with get_vars() 18823 1726855026.92952: done getting variables 18823 1726855026.93044: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:57:06 -0400 (0:00:00.109) 0:00:18.582 ****** 18823 1726855026.93097: entering _queue_task() for managed_node2/service 18823 1726855026.93548: worker is 1 (out of 1 available) 18823 1726855026.93561: exiting _queue_task() for managed_node2/service 18823 1726855026.93574: done queuing things up, now waiting for results queue to drain 18823 1726855026.93575: waiting for pending results... 
18823 1726855026.94030: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 18823 1726855026.94036: in run() - task 0affcc66-ac2b-d391-077c-000000000027 18823 1726855026.94038: variable 'ansible_search_path' from source: unknown 18823 1726855026.94040: variable 'ansible_search_path' from source: unknown 18823 1726855026.94044: calling self._execute() 18823 1726855026.94338: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855026.94361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855026.94404: variable 'omit' from source: magic vars 18823 1726855026.94983: variable 'ansible_distribution_major_version' from source: facts 18823 1726855026.95002: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855026.95119: variable 'network_provider' from source: set_fact 18823 1726855026.95124: Evaluated conditional (network_provider == "initscripts"): False 18823 1726855026.95127: when evaluation is False, skipping this task 18823 1726855026.95130: _execute() done 18823 1726855026.95135: dumping result to json 18823 1726855026.95137: done dumping result, returning 18823 1726855026.95146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-d391-077c-000000000027] 18823 1726855026.95193: sending task result for task 0affcc66-ac2b-d391-077c-000000000027 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855026.95364: no more pending results, returning what we have 18823 1726855026.95368: results queue empty 18823 1726855026.95369: checking for any_errors_fatal 18823 1726855026.95375: done checking for any_errors_fatal 18823 1726855026.95376: checking for max_fail_percentage 18823 1726855026.95377: done checking for max_fail_percentage 18823 
1726855026.95378: checking to see if all hosts have failed and the running result is not ok 18823 1726855026.95379: done checking to see if all hosts have failed 18823 1726855026.95379: getting the remaining hosts for this loop 18823 1726855026.95381: done getting the remaining hosts for this loop 18823 1726855026.95384: getting the next task for host managed_node2 18823 1726855026.95392: done getting next task for host managed_node2 18823 1726855026.95398: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18823 1726855026.95401: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855026.95413: getting variables 18823 1726855026.95414: in VariableManager get_vars() 18823 1726855026.95445: Calling all_inventory to load vars for managed_node2 18823 1726855026.95447: Calling groups_inventory to load vars for managed_node2 18823 1726855026.95449: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855026.95457: Calling all_plugins_play to load vars for managed_node2 18823 1726855026.95460: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855026.95462: Calling groups_plugins_play to load vars for managed_node2 18823 1726855026.96642: done sending task result for task 0affcc66-ac2b-d391-077c-000000000027 18823 1726855026.96646: WORKER PROCESS EXITING 18823 1726855026.97295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855026.99458: done with get_vars() 18823 1726855026.99513: done getting variables 18823 1726855026.99576: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:57:06 -0400 (0:00:00.065) 0:00:18.647 ****** 18823 1726855026.99619: entering _queue_task() for managed_node2/copy 18823 1726855026.99969: worker is 1 (out of 1 available) 18823 1726855026.99984: exiting _queue_task() for managed_node2/copy 18823 1726855026.99998: done queuing things up, now waiting for results queue to drain 18823 1726855027.00000: waiting for pending results... 18823 1726855027.00306: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18823 1726855027.00392: in run() - task 0affcc66-ac2b-d391-077c-000000000028 18823 1726855027.00413: variable 'ansible_search_path' from source: unknown 18823 1726855027.00422: variable 'ansible_search_path' from source: unknown 18823 1726855027.00574: calling self._execute() 18823 1726855027.00578: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855027.00581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855027.00584: variable 'omit' from source: magic vars 18823 1726855027.01175: variable 'ansible_distribution_major_version' from source: facts 18823 1726855027.01191: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855027.01317: variable 'network_provider' from source: set_fact 18823 1726855027.01327: Evaluated conditional (network_provider == "initscripts"): False 18823 1726855027.01334: when evaluation is False, skipping this task 18823 1726855027.01340: _execute() done 18823 1726855027.01347: dumping result to json 
18823 1726855027.01354: done dumping result, returning 18823 1726855027.01405: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-d391-077c-000000000028] 18823 1726855027.01408: sending task result for task 0affcc66-ac2b-d391-077c-000000000028 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18823 1726855027.01553: no more pending results, returning what we have 18823 1726855027.01557: results queue empty 18823 1726855027.01558: checking for any_errors_fatal 18823 1726855027.01565: done checking for any_errors_fatal 18823 1726855027.01566: checking for max_fail_percentage 18823 1726855027.01568: done checking for max_fail_percentage 18823 1726855027.01570: checking to see if all hosts have failed and the running result is not ok 18823 1726855027.01570: done checking to see if all hosts have failed 18823 1726855027.01571: getting the remaining hosts for this loop 18823 1726855027.01573: done getting the remaining hosts for this loop 18823 1726855027.01576: getting the next task for host managed_node2 18823 1726855027.01584: done getting next task for host managed_node2 18823 1726855027.01590: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18823 1726855027.01593: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855027.01608: getting variables 18823 1726855027.01610: in VariableManager get_vars() 18823 1726855027.01649: Calling all_inventory to load vars for managed_node2 18823 1726855027.01652: Calling groups_inventory to load vars for managed_node2 18823 1726855027.01655: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855027.01668: Calling all_plugins_play to load vars for managed_node2 18823 1726855027.01671: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855027.01673: Calling groups_plugins_play to load vars for managed_node2 18823 1726855027.02312: done sending task result for task 0affcc66-ac2b-d391-077c-000000000028 18823 1726855027.02315: WORKER PROCESS EXITING 18823 1726855027.04205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855027.07102: done with get_vars() 18823 1726855027.07130: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:57:07 -0400 (0:00:00.075) 0:00:18.723 ****** 18823 1726855027.07217: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18823 1726855027.07219: Creating lock for fedora.linux_system_roles.network_connections 18823 1726855027.07626: worker is 1 (out of 1 available) 18823 1726855027.07742: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18823 1726855027.07868: done queuing things up, now waiting for results queue to drain 18823 1726855027.07869: waiting for pending results... 
18823 1726855027.08043: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18823 1726855027.08158: in run() - task 0affcc66-ac2b-d391-077c-000000000029 18823 1726855027.08178: variable 'ansible_search_path' from source: unknown 18823 1726855027.08192: variable 'ansible_search_path' from source: unknown 18823 1726855027.08236: calling self._execute() 18823 1726855027.08345: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855027.08355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855027.08369: variable 'omit' from source: magic vars 18823 1726855027.08752: variable 'ansible_distribution_major_version' from source: facts 18823 1726855027.08768: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855027.08776: variable 'omit' from source: magic vars 18823 1726855027.08811: variable 'omit' from source: magic vars 18823 1726855027.08997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855027.11333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855027.11438: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855027.11895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855027.11897: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855027.11899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855027.11949: variable 'network_provider' from source: set_fact 18823 1726855027.12274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855027.12936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855027.12970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855027.13019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855027.13111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855027.13202: variable 'omit' from source: magic vars 18823 1726855027.13692: variable 'omit' from source: magic vars 18823 1726855027.13695: variable 'network_connections' from source: play vars 18823 1726855027.13698: variable 'interface' from source: set_fact 18823 1726855027.13700: variable 'interface' from source: set_fact 18823 1726855027.13714: variable 'interface' from source: set_fact 18823 1726855027.13776: variable 'interface' from source: set_fact 18823 1726855027.13951: variable 'omit' from source: magic vars 18823 1726855027.13965: variable '__lsr_ansible_managed' from source: task vars 18823 1726855027.14036: variable '__lsr_ansible_managed' from source: task vars 18823 1726855027.14323: Loaded config def from plugin (lookup/template) 18823 1726855027.14341: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18823 1726855027.14377: File lookup term: get_ansible_managed.j2 18823 
1726855027.14386: variable 'ansible_search_path' from source: unknown 18823 1726855027.14398: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18823 1726855027.14416: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18823 1726855027.14439: variable 'ansible_search_path' from source: unknown 18823 1726855027.21844: variable 'ansible_managed' from source: unknown 18823 1726855027.21979: variable 'omit' from source: magic vars 18823 1726855027.22218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855027.22249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855027.22322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855027.22345: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855027.22359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855027.22411: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855027.22421: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855027.22429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855027.22642: Set connection var ansible_timeout to 10 18823 1726855027.22645: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855027.22648: Set connection var ansible_shell_type to sh 18823 1726855027.22650: Set connection var ansible_shell_executable to /bin/sh 18823 1726855027.22652: Set connection var ansible_connection to ssh 18823 1726855027.22653: Set connection var ansible_pipelining to False 18823 1726855027.22655: variable 'ansible_shell_executable' from source: unknown 18823 1726855027.22657: variable 'ansible_connection' from source: unknown 18823 1726855027.22659: variable 'ansible_module_compression' from source: unknown 18823 1726855027.22661: variable 'ansible_shell_type' from source: unknown 18823 1726855027.22664: variable 'ansible_shell_executable' from source: unknown 18823 1726855027.22666: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855027.22667: variable 'ansible_pipelining' from source: unknown 18823 1726855027.22669: variable 'ansible_timeout' from source: unknown 18823 1726855027.22671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855027.22852: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855027.22876: variable 'omit' from source: magic vars 18823 1726855027.22890: starting attempt loop 18823 1726855027.22898: running the handler 18823 1726855027.22917: _low_level_execute_command(): starting 18823 1726855027.22955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855027.24223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855027.24247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855027.24267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855027.24313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855027.24352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855027.24385: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855027.24468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855027.24522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855027.24609: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855027.26346: stdout chunk (state=3): >>>/root <<< 18823 1726855027.26525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855027.26529: stdout chunk (state=3): >>><<< 18823 1726855027.26531: stderr chunk (state=3): >>><<< 18823 1726855027.26550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855027.26577: _low_level_execute_command(): starting 18823 1726855027.26596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537 `" && echo ansible-tmp-1726855027.2655675-19738-23048770263537="` echo 
/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537 `" ) && sleep 0' 18823 1726855027.27536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855027.27550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855027.27566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855027.27584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855027.27609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855027.27656: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855027.27722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855027.27768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855027.27771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855027.27985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855027.29879: stdout chunk (state=3): >>>ansible-tmp-1726855027.2655675-19738-23048770263537=/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537 <<< 18823 1726855027.30294: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 18823 1726855027.30298: stdout chunk (state=3): >>><<< 18823 1726855027.30301: stderr chunk (state=3): >>><<< 18823 1726855027.30303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855027.2655675-19738-23048770263537=/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855027.30305: variable 'ansible_module_compression' from source: unknown 18823 1726855027.30307: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 18823 1726855027.30310: ANSIBALLZ: Acquiring lock 18823 1726855027.30312: ANSIBALLZ: Lock acquired: 140142266071184 18823 1726855027.30314: ANSIBALLZ: Creating module 18823 1726855027.68586: ANSIBALLZ: Writing module into payload 18823 1726855027.69555: ANSIBALLZ: Writing module 18823 
1726855027.69575: ANSIBALLZ: Renaming module 18823 1726855027.69579: ANSIBALLZ: Done creating module 18823 1726855027.69612: variable 'ansible_facts' from source: unknown 18823 1726855027.69890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py 18823 1726855027.70318: Sending initial data 18823 1726855027.70322: Sent initial data (167 bytes) 18823 1726855027.71319: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855027.71323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855027.71327: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855027.71370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855027.71498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855027.73251: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855027.73444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855027.73448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpyo957q5m /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py <<< 18823 1726855027.73455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py" <<< 18823 1726855027.73528: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpyo957q5m" to remote "/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py" <<< 18823 1726855027.76108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855027.76112: stdout chunk (state=3): >>><<< 18823 1726855027.76141: stderr chunk (state=3): >>><<< 18823 1726855027.76214: done transferring module to remote 18823 1726855027.76243: _low_level_execute_command(): starting 18823 1726855027.76246: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/ /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py && sleep 0' 18823 1726855027.77203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855027.77293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855027.77297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855027.77315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855027.77469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855027.79308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855027.79312: stdout chunk (state=3): >>><<< 18823 1726855027.79314: stderr chunk (state=3): >>><<< 18823 1726855027.79337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855027.79346: _low_level_execute_command(): starting 18823 1726855027.79429: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/AnsiballZ_network_connections.py && sleep 0' 18823 1726855027.80011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855027.80026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855027.80040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855027.80060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855027.80078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855027.80102: stderr chunk (state=3): 
>>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855027.80191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855027.80224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855027.80241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855027.80348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855028.26523: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18823 1726855028.28379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855028.28391: stdout chunk (state=3): >>><<< 18823 1726855028.28519: stderr chunk (state=3): >>><<< 18823 1726855028.28522: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855028.28582: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855028.28643: _low_level_execute_command(): starting 18823 1726855028.28653: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855027.2655675-19738-23048770263537/ > /dev/null 2>&1 && sleep 0' 18823 1726855028.29899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855028.29905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855028.29908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855028.29910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855028.29912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855028.29959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855028.30127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855028.30151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855028.32698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855028.32702: stdout chunk (state=3): >>><<< 18823 1726855028.32709: stderr chunk (state=3): >>><<< 18823 1726855028.32712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855028.32720: handler run complete 18823 1726855028.32722: attempt loop complete, returning result 18823 1726855028.32724: _execute() done 18823 1726855028.32726: dumping result to json 18823 1726855028.32729: done dumping result, returning 18823 1726855028.32731: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-d391-077c-000000000029] 18823 1726855028.32733: sending task result for task 0affcc66-ac2b-d391-077c-000000000029 changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection 
lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 (not-active) 18823 1726855028.33139: no more pending results, returning what we have 18823 1726855028.33143: results queue empty 18823 1726855028.33145: checking for any_errors_fatal 18823 1726855028.33153: done checking for any_errors_fatal 18823 1726855028.33154: checking for max_fail_percentage 18823 1726855028.33156: done checking for max_fail_percentage 18823 1726855028.33157: checking to see if all hosts have failed and the running result is not ok 18823 1726855028.33157: done checking to see if all hosts have failed 18823 1726855028.33158: getting the remaining hosts for this loop 18823 1726855028.33160: done getting the remaining hosts for this loop 18823 1726855028.33164: getting the next task for host managed_node2 18823 1726855028.33172: done getting next task for host managed_node2 18823 1726855028.33176: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18823 1726855028.33178: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855028.33522: done sending task result for task 0affcc66-ac2b-d391-077c-000000000029 18823 1726855028.33526: WORKER PROCESS EXITING 18823 1726855028.33532: getting variables 18823 1726855028.33533: in VariableManager get_vars() 18823 1726855028.33572: Calling all_inventory to load vars for managed_node2 18823 1726855028.33575: Calling groups_inventory to load vars for managed_node2 18823 1726855028.33577: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855028.33590: Calling all_plugins_play to load vars for managed_node2 18823 1726855028.33594: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855028.33597: Calling groups_plugins_play to load vars for managed_node2 18823 1726855028.36174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855028.37983: done with get_vars() 18823 1726855028.38015: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:57:08 -0400 (0:00:01.311) 0:00:20.034 ****** 18823 1726855028.38323: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18823 1726855028.38325: Creating lock for fedora.linux_system_roles.network_state 18823 1726855028.39492: worker is 1 (out of 1 available) 18823 1726855028.39500: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18823 1726855028.39513: done queuing things up, now waiting for results queue to drain 18823 1726855028.39514: waiting for pending results... 
18823 1726855028.39665: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 18823 1726855028.39869: in run() - task 0affcc66-ac2b-d391-077c-00000000002a 18823 1726855028.39882: variable 'ansible_search_path' from source: unknown 18823 1726855028.39885: variable 'ansible_search_path' from source: unknown 18823 1726855028.39973: calling self._execute() 18823 1726855028.40177: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.40180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.40295: variable 'omit' from source: magic vars 18823 1726855028.41041: variable 'ansible_distribution_major_version' from source: facts 18823 1726855028.41045: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855028.41223: variable 'network_state' from source: role '' defaults 18823 1726855028.41234: Evaluated conditional (network_state != {}): False 18823 1726855028.41238: when evaluation is False, skipping this task 18823 1726855028.41241: _execute() done 18823 1726855028.41244: dumping result to json 18823 1726855028.41380: done dumping result, returning 18823 1726855028.41388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-d391-077c-00000000002a] 18823 1726855028.41394: sending task result for task 0affcc66-ac2b-d391-077c-00000000002a 18823 1726855028.41488: done sending task result for task 0affcc66-ac2b-d391-077c-00000000002a 18823 1726855028.41492: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855028.41554: no more pending results, returning what we have 18823 1726855028.41559: results queue empty 18823 1726855028.41561: checking for any_errors_fatal 18823 1726855028.41577: done checking for any_errors_fatal 
18823 1726855028.41578: checking for max_fail_percentage 18823 1726855028.41580: done checking for max_fail_percentage 18823 1726855028.41581: checking to see if all hosts have failed and the running result is not ok 18823 1726855028.41582: done checking to see if all hosts have failed 18823 1726855028.41582: getting the remaining hosts for this loop 18823 1726855028.41584: done getting the remaining hosts for this loop 18823 1726855028.41594: getting the next task for host managed_node2 18823 1726855028.41601: done getting next task for host managed_node2 18823 1726855028.41606: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18823 1726855028.41609: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855028.41623: getting variables 18823 1726855028.41624: in VariableManager get_vars() 18823 1726855028.41658: Calling all_inventory to load vars for managed_node2 18823 1726855028.41661: Calling groups_inventory to load vars for managed_node2 18823 1726855028.41663: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855028.41674: Calling all_plugins_play to load vars for managed_node2 18823 1726855028.41677: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855028.41680: Calling groups_plugins_play to load vars for managed_node2 18823 1726855028.45337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855028.49008: done with get_vars() 18823 1726855028.49042: done getting variables 18823 1726855028.49268: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:57:08 -0400 (0:00:00.109) 0:00:20.144 ****** 18823 1726855028.49420: entering _queue_task() for managed_node2/debug 18823 1726855028.50277: worker is 1 (out of 1 available) 18823 1726855028.50308: exiting _queue_task() for managed_node2/debug 18823 1726855028.50320: done queuing things up, now waiting for results queue to drain 18823 1726855028.50321: waiting for pending results... 
18823 1726855028.51007: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18823 1726855028.51012: in run() - task 0affcc66-ac2b-d391-077c-00000000002b 18823 1726855028.51015: variable 'ansible_search_path' from source: unknown 18823 1726855028.51017: variable 'ansible_search_path' from source: unknown 18823 1726855028.51020: calling self._execute() 18823 1726855028.51169: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.51174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.51177: variable 'omit' from source: magic vars 18823 1726855028.51850: variable 'ansible_distribution_major_version' from source: facts 18823 1726855028.51861: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855028.51868: variable 'omit' from source: magic vars 18823 1726855028.52011: variable 'omit' from source: magic vars 18823 1726855028.52146: variable 'omit' from source: magic vars 18823 1726855028.52191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855028.52280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855028.52369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855028.52389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855028.52401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855028.52434: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855028.52437: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.52440: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855028.52709: Set connection var ansible_timeout to 10 18823 1726855028.52716: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855028.52719: Set connection var ansible_shell_type to sh 18823 1726855028.52728: Set connection var ansible_shell_executable to /bin/sh 18823 1726855028.52730: Set connection var ansible_connection to ssh 18823 1726855028.52733: Set connection var ansible_pipelining to False 18823 1726855028.52760: variable 'ansible_shell_executable' from source: unknown 18823 1726855028.52764: variable 'ansible_connection' from source: unknown 18823 1726855028.52767: variable 'ansible_module_compression' from source: unknown 18823 1726855028.52769: variable 'ansible_shell_type' from source: unknown 18823 1726855028.52772: variable 'ansible_shell_executable' from source: unknown 18823 1726855028.52774: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.52778: variable 'ansible_pipelining' from source: unknown 18823 1726855028.52781: variable 'ansible_timeout' from source: unknown 18823 1726855028.52785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.52976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855028.52986: variable 'omit' from source: magic vars 18823 1726855028.52994: starting attempt loop 18823 1726855028.52997: running the handler 18823 1726855028.53130: variable '__network_connections_result' from source: set_fact 18823 1726855028.53232: handler run complete 18823 1726855028.53243: attempt loop complete, returning result 18823 1726855028.53246: _execute() done 18823 1726855028.53249: dumping result to json 18823 1726855028.53252: 
done dumping result, returning 18823 1726855028.53262: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-d391-077c-00000000002b] 18823 1726855028.53269: sending task result for task 0affcc66-ac2b-d391-077c-00000000002b ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 (not-active)" ] } 18823 1726855028.53736: no more pending results, returning what we have 18823 1726855028.53739: results queue empty 18823 1726855028.53740: checking for any_errors_fatal 18823 1726855028.53745: done checking for any_errors_fatal 18823 1726855028.53746: checking for max_fail_percentage 18823 1726855028.53748: done checking for max_fail_percentage 18823 1726855028.53749: checking to see if all hosts have failed and the running result is not ok 18823 1726855028.53750: done checking to see if all hosts have failed 18823 1726855028.53750: getting the remaining hosts for this loop 18823 1726855028.53752: done getting the remaining hosts for this loop 18823 1726855028.53755: getting the next task for host managed_node2 18823 1726855028.53760: done getting next task for host managed_node2 18823 1726855028.53764: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18823 1726855028.53766: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855028.53775: getting variables 18823 1726855028.53776: in VariableManager get_vars() 18823 1726855028.53816: Calling all_inventory to load vars for managed_node2 18823 1726855028.53818: Calling groups_inventory to load vars for managed_node2 18823 1726855028.53821: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855028.53830: Calling all_plugins_play to load vars for managed_node2 18823 1726855028.53832: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855028.53835: Calling groups_plugins_play to load vars for managed_node2 18823 1726855028.54401: done sending task result for task 0affcc66-ac2b-d391-077c-00000000002b 18823 1726855028.54408: WORKER PROCESS EXITING 18823 1726855028.56643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855028.58923: done with get_vars() 18823 1726855028.58980: done getting variables 18823 1726855028.59049: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:57:08 -0400 (0:00:00.096) 0:00:20.242 ****** 18823 1726855028.59182: entering _queue_task() for managed_node2/debug 18823 1726855028.60106: worker is 1 (out of 1 available) 18823 1726855028.60119: exiting _queue_task() for managed_node2/debug 18823 1726855028.60132: done queuing things up, now waiting for results queue to drain 18823 1726855028.60133: waiting for pending results... 
18823 1726855028.60520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18823 1726855028.60946: in run() - task 0affcc66-ac2b-d391-077c-00000000002c 18823 1726855028.60950: variable 'ansible_search_path' from source: unknown 18823 1726855028.60953: variable 'ansible_search_path' from source: unknown 18823 1726855028.60956: calling self._execute() 18823 1726855028.61272: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.61276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.61279: variable 'omit' from source: magic vars 18823 1726855028.62077: variable 'ansible_distribution_major_version' from source: facts 18823 1726855028.62294: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855028.62298: variable 'omit' from source: magic vars 18823 1726855028.62300: variable 'omit' from source: magic vars 18823 1726855028.62541: variable 'omit' from source: magic vars 18823 1726855028.62545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855028.62548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855028.62658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855028.62760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855028.62763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855028.62766: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855028.62769: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.62772: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855028.62956: Set connection var ansible_timeout to 10 18823 1726855028.62994: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855028.63196: Set connection var ansible_shell_type to sh 18823 1726855028.63199: Set connection var ansible_shell_executable to /bin/sh 18823 1726855028.63202: Set connection var ansible_connection to ssh 18823 1726855028.63279: Set connection var ansible_pipelining to False 18823 1726855028.63283: variable 'ansible_shell_executable' from source: unknown 18823 1726855028.63285: variable 'ansible_connection' from source: unknown 18823 1726855028.63290: variable 'ansible_module_compression' from source: unknown 18823 1726855028.63292: variable 'ansible_shell_type' from source: unknown 18823 1726855028.63295: variable 'ansible_shell_executable' from source: unknown 18823 1726855028.63302: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.63306: variable 'ansible_pipelining' from source: unknown 18823 1726855028.63309: variable 'ansible_timeout' from source: unknown 18823 1726855028.63311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.63740: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855028.63745: variable 'omit' from source: magic vars 18823 1726855028.63747: starting attempt loop 18823 1726855028.63758: running the handler 18823 1726855028.64067: variable '__network_connections_result' from source: set_fact 18823 1726855028.64122: variable '__network_connections_result' from source: set_fact 18823 1726855028.64383: handler run complete 18823 1726855028.64443: attempt loop complete, returning result 18823 1726855028.64508: 
_execute() done 18823 1726855028.64525: dumping result to json 18823 1726855028.64535: done dumping result, returning 18823 1726855028.64549: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-d391-077c-00000000002c] 18823 1726855028.64631: sending task result for task 0affcc66-ac2b-d391-077c-00000000002c ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, e1c880fd-fdb5-4526-932f-74f695fa6757 (not-active)" ] } } 18823 1726855028.65034: no more pending results, returning what we have 18823 1726855028.65038: results queue empty 18823 1726855028.65039: checking for any_errors_fatal 18823 1726855028.65045: done checking for any_errors_fatal 18823 1726855028.65046: checking for max_fail_percentage 18823 1726855028.65049: done checking for max_fail_percentage 18823 1726855028.65050: checking to see if all hosts have failed and the running result is not ok 18823 1726855028.65050: done checking to see if all hosts have failed 18823 1726855028.65051: getting the remaining hosts for this loop 18823 1726855028.65053: done getting the 
remaining hosts for this loop 18823 1726855028.65057: getting the next task for host managed_node2 18823 1726855028.65064: done getting next task for host managed_node2 18823 1726855028.65068: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18823 1726855028.65070: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855028.65081: getting variables 18823 1726855028.65083: in VariableManager get_vars() 18823 1726855028.65441: Calling all_inventory to load vars for managed_node2 18823 1726855028.65444: Calling groups_inventory to load vars for managed_node2 18823 1726855028.65447: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855028.65458: Calling all_plugins_play to load vars for managed_node2 18823 1726855028.65461: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855028.65463: Calling groups_plugins_play to load vars for managed_node2 18823 1726855028.66106: done sending task result for task 0affcc66-ac2b-d391-077c-00000000002c 18823 1726855028.66110: WORKER PROCESS EXITING 18823 1726855028.68824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855028.72623: done with get_vars() 18823 1726855028.72648: done getting variables 18823 1726855028.72846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:57:08 -0400 (0:00:00.137) 0:00:20.380 ****** 18823 1726855028.72877: entering _queue_task() for managed_node2/debug 18823 1726855028.73832: worker is 1 (out of 1 available) 18823 1726855028.73845: exiting _queue_task() for managed_node2/debug 18823 1726855028.73855: done queuing things up, now waiting for results queue to drain 18823 1726855028.73856: waiting for pending results... 18823 1726855028.74210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18823 1726855028.74396: in run() - task 0affcc66-ac2b-d391-077c-00000000002d 18823 1726855028.74467: variable 'ansible_search_path' from source: unknown 18823 1726855028.74475: variable 'ansible_search_path' from source: unknown 18823 1726855028.74540: calling self._execute() 18823 1726855028.74764: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.74886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.74891: variable 'omit' from source: magic vars 18823 1726855028.75789: variable 'ansible_distribution_major_version' from source: facts 18823 1726855028.75920: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855028.76201: variable 'network_state' from source: role '' defaults 18823 1726855028.76220: Evaluated conditional (network_state != {}): False 18823 1726855028.76228: when evaluation is False, skipping this task 18823 1726855028.76236: _execute() done 18823 1726855028.76249: dumping result to json 18823 1726855028.76257: done dumping result, returning 18823 1726855028.76356: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-d391-077c-00000000002d] 18823 1726855028.76360: sending task result for task 
0affcc66-ac2b-d391-077c-00000000002d skipping: [managed_node2] => { "false_condition": "network_state != {}" } 18823 1726855028.76645: no more pending results, returning what we have 18823 1726855028.76651: results queue empty 18823 1726855028.76652: checking for any_errors_fatal 18823 1726855028.76664: done checking for any_errors_fatal 18823 1726855028.76665: checking for max_fail_percentage 18823 1726855028.76667: done checking for max_fail_percentage 18823 1726855028.76668: checking to see if all hosts have failed and the running result is not ok 18823 1726855028.76669: done checking to see if all hosts have failed 18823 1726855028.76670: getting the remaining hosts for this loop 18823 1726855028.76671: done getting the remaining hosts for this loop 18823 1726855028.76676: getting the next task for host managed_node2 18823 1726855028.76683: done getting next task for host managed_node2 18823 1726855028.76690: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18823 1726855028.76694: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855028.76713: getting variables 18823 1726855028.76715: in VariableManager get_vars() 18823 1726855028.76756: Calling all_inventory to load vars for managed_node2 18823 1726855028.76759: Calling groups_inventory to load vars for managed_node2 18823 1726855028.76762: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855028.76776: Calling all_plugins_play to load vars for managed_node2 18823 1726855028.76779: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855028.76782: Calling groups_plugins_play to load vars for managed_node2 18823 1726855028.78069: done sending task result for task 0affcc66-ac2b-d391-077c-00000000002d 18823 1726855028.78073: WORKER PROCESS EXITING 18823 1726855028.80117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855028.83738: done with get_vars() 18823 1726855028.83890: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:57:08 -0400 (0:00:00.113) 0:00:20.493 ****** 18823 1726855028.84220: entering _queue_task() for managed_node2/ping 18823 1726855028.84222: Creating lock for ping 18823 1726855028.84999: worker is 1 (out of 1 available) 18823 1726855028.85011: exiting _queue_task() for managed_node2/ping 18823 1726855028.85025: done queuing things up, now waiting for results queue to drain 18823 1726855028.85026: waiting for pending results... 
18823 1726855028.85581: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 18823 1726855028.85677: in run() - task 0affcc66-ac2b-d391-077c-00000000002e 18823 1726855028.86124: variable 'ansible_search_path' from source: unknown 18823 1726855028.86131: variable 'ansible_search_path' from source: unknown 18823 1726855028.86142: calling self._execute() 18823 1726855028.86336: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.86340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.86355: variable 'omit' from source: magic vars 18823 1726855028.87431: variable 'ansible_distribution_major_version' from source: facts 18823 1726855028.87443: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855028.87451: variable 'omit' from source: magic vars 18823 1726855028.87896: variable 'omit' from source: magic vars 18823 1726855028.87991: variable 'omit' from source: magic vars 18823 1726855028.87994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855028.88223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855028.88246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855028.88264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855028.88276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855028.88515: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855028.88519: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.88521: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 18823 1726855028.88626: Set connection var ansible_timeout to 10 18823 1726855028.88693: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855028.88696: Set connection var ansible_shell_type to sh 18823 1726855028.88698: Set connection var ansible_shell_executable to /bin/sh 18823 1726855028.88699: Set connection var ansible_connection to ssh 18823 1726855028.88701: Set connection var ansible_pipelining to False 18823 1726855028.88702: variable 'ansible_shell_executable' from source: unknown 18823 1726855028.88704: variable 'ansible_connection' from source: unknown 18823 1726855028.88706: variable 'ansible_module_compression' from source: unknown 18823 1726855028.88708: variable 'ansible_shell_type' from source: unknown 18823 1726855028.88710: variable 'ansible_shell_executable' from source: unknown 18823 1726855028.88713: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855028.88714: variable 'ansible_pipelining' from source: unknown 18823 1726855028.88716: variable 'ansible_timeout' from source: unknown 18823 1726855028.89101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855028.89595: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855028.89607: variable 'omit' from source: magic vars 18823 1726855028.89624: starting attempt loop 18823 1726855028.89627: running the handler 18823 1726855028.89629: _low_level_execute_command(): starting 18823 1726855028.89693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855028.90915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855028.90922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855028.91406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855028.91430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855028.93135: stdout chunk (state=3): >>>/root <<< 18823 1726855028.93247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855028.93333: stderr chunk (state=3): >>><<< 18823 1726855028.93337: stdout chunk (state=3): >>><<< 18823 1726855028.93361: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855028.93373: _low_level_execute_command(): starting 18823 1726855028.93380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562 `" && echo ansible-tmp-1726855028.933585-19820-118713384912562="` echo /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562 `" ) && sleep 0' 18823 1726855028.94093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855028.94098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855028.94100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855028.94108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855028.94129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855028.94132: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855028.94176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855028.94191: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 18823 1726855028.94195: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855028.94197: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855028.94200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855028.94202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855028.94238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855028.94293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855028.94300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855028.94316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855028.94423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855028.96352: stdout chunk (state=3): >>>ansible-tmp-1726855028.933585-19820-118713384912562=/root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562 <<< 18823 1726855028.96524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855028.96527: stdout chunk (state=3): >>><<< 18823 1726855028.96530: stderr chunk (state=3): >>><<< 18823 1726855028.96674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855028.933585-19820-118713384912562=/root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855028.96678: variable 'ansible_module_compression' from source: unknown 18823 1726855028.96738: ANSIBALLZ: Using lock for ping 18823 1726855028.96773: ANSIBALLZ: Acquiring lock 18823 1726855028.96792: ANSIBALLZ: Lock acquired: 140142262154640 18823 1726855028.96849: ANSIBALLZ: Creating module 18823 1726855029.21035: ANSIBALLZ: Writing module into payload 18823 1726855029.21101: ANSIBALLZ: Writing module 18823 1726855029.21142: ANSIBALLZ: Renaming module 18823 1726855029.21146: ANSIBALLZ: Done creating module 18823 1726855029.21251: variable 'ansible_facts' from source: unknown 18823 1726855029.21255: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py 18823 1726855029.21493: Sending initial data 18823 1726855029.21497: Sent initial data (152 bytes) 18823 1726855029.21953: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 
1726855029.21959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855029.21994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.21997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855029.22000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855029.22002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.22060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855029.22063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855029.22067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.22145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.23881: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855029.23914: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855029.23998: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpcv_tdvxi /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py <<< 18823 1726855029.24006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py" <<< 18823 1726855029.24059: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpcv_tdvxi" to remote "/root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py" <<< 18823 1726855029.24992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855029.24995: stderr chunk (state=3): >>><<< 18823 1726855029.24998: stdout chunk (state=3): >>><<< 18823 1726855029.25012: done transferring module to remote 18823 1726855029.25028: _low_level_execute_command(): starting 18823 1726855029.25035: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/ /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py && sleep 0' 18823 1726855029.25720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 
1726855029.25727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855029.25751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.25756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855029.25762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855029.25779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855029.25793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.25839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855029.25853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.25954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.27826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855029.27829: stdout chunk (state=3): >>><<< 18823 1726855029.27831: stderr chunk (state=3): >>><<< 18823 1726855029.27843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855029.27939: _low_level_execute_command(): starting 18823 1726855029.27942: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/AnsiballZ_ping.py && sleep 0' 18823 1726855029.28503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.28542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855029.28552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855029.28569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.28664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.43617: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18823 1726855029.44783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855029.44814: stderr chunk (state=3): >>><<< 18823 1726855029.44817: stdout chunk (state=3): >>><<< 18823 1726855029.44834: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855029.44857: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855029.44865: _low_level_execute_command(): starting 18823 1726855029.44869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855028.933585-19820-118713384912562/ > /dev/null 2>&1 && sleep 0' 18823 1726855029.45322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855029.45325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855029.45328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855029.45330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.45385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855029.45399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855029.45401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.45461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.47316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855029.47340: stderr chunk (state=3): >>><<< 18823 1726855029.47343: stdout chunk (state=3): >>><<< 18823 1726855029.47357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855029.47363: handler run complete 18823 1726855029.47374: attempt loop complete, returning result 18823 1726855029.47377: _execute() done 18823 1726855029.47380: dumping result to json 18823 1726855029.47384: done dumping result, returning 18823 1726855029.47394: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-d391-077c-00000000002e] 18823 1726855029.47399: sending task result for task 0affcc66-ac2b-d391-077c-00000000002e 18823 1726855029.47484: done sending task result for task 0affcc66-ac2b-d391-077c-00000000002e 18823 1726855029.47487: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 18823 1726855029.47546: no more pending results, returning what we have 18823 1726855029.47549: results queue empty 18823 1726855029.47550: checking for any_errors_fatal 18823 1726855029.47557: done checking for any_errors_fatal 18823 1726855029.47558: checking for max_fail_percentage 18823 1726855029.47559: done checking for max_fail_percentage 18823 1726855029.47560: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.47561: done checking to see if all hosts have failed 18823 1726855029.47561: getting the remaining hosts for this loop 18823 1726855029.47563: done getting the remaining hosts for this loop 18823 1726855029.47566: getting the next task for host managed_node2 18823 1726855029.47575: done getting next task for host managed_node2 18823 1726855029.47577: ^ task is: TASK: meta (role_complete) 18823 1726855029.47579: ^ state is: HOST STATE: block=3, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855029.47590: getting variables 18823 1726855029.47592: in VariableManager get_vars() 18823 1726855029.47632: Calling all_inventory to load vars for managed_node2 18823 1726855029.47635: Calling groups_inventory to load vars for managed_node2 18823 1726855029.47637: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.47648: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.47651: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.47653: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.48621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.49479: done with get_vars() 18823 1726855029.49499: done getting variables 18823 1726855029.49561: done queuing things up, now waiting for results queue to drain 18823 1726855029.49563: results queue empty 18823 1726855029.49563: checking for any_errors_fatal 18823 1726855029.49565: done checking for any_errors_fatal 18823 1726855029.49565: checking for max_fail_percentage 18823 1726855029.49566: done checking for max_fail_percentage 18823 1726855029.49566: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.49567: done checking to see if all hosts have failed 18823 1726855029.49567: getting the remaining hosts for this loop 18823 1726855029.49568: done getting the remaining hosts for this loop 18823 1726855029.49570: getting the next task for host managed_node2 18823 1726855029.49572: done getting next task for host managed_node2 18823 1726855029.49574: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18823 
1726855029.49575: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855029.49576: getting variables 18823 1726855029.49577: in VariableManager get_vars() 18823 1726855029.49585: Calling all_inventory to load vars for managed_node2 18823 1726855029.49588: Calling groups_inventory to load vars for managed_node2 18823 1726855029.49590: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.49593: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.49595: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.49596: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.50233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.51100: done with get_vars() 18823 1726855029.51117: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 13:57:09 -0400 (0:00:00.669) 0:00:21.163 ****** 18823 1726855029.51195: entering _queue_task() for managed_node2/include_tasks 18823 1726855029.51452: worker is 1 (out of 1 available) 18823 1726855029.51466: exiting _queue_task() for managed_node2/include_tasks 18823 1726855029.51477: done queuing things up, now waiting for results queue to drain 18823 1726855029.51478: waiting for pending results... 
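The preceding records show the full lifecycle of the "Re-test connectivity" task: the controller creates a remote tmp dir, transfers `AnsiballZ_ping.py` over the multiplexed SSH connection, runs it with `/usr/bin/python3.12`, parses the JSON on stdout, and removes the tmp dir. The AnsiballZ wrapper itself is a zip payload, but the module logic it carries is tiny; a minimal sketch of what `ansible.builtin.ping` effectively does on the remote host (an illustration of the contract, not the actual AnsiballZ payload):

```python
import json

def ping_module(module_args):
    """Mimic ansible.builtin.ping: echo the 'data' argument back (default 'pong')."""
    data = module_args.get("data", "pong")
    if data == "crash":
        # the real module deliberately raises for data=crash so tests can exercise failures
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# The controller reads the module's stdout as JSON, which is why the log above
# captures the stdout chunk '{"ping": "pong", "invocation": {...}}'.
print(json.dumps(ping_module({"data": "pong"})))
```

The `rc=0` plus this JSON body is what lets `_execute_module()` report `ok: [managed_node2] => {"changed": false, "ping": "pong"}` further down.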
18823 1726855029.51649: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18823 1726855029.51722: in run() - task 0affcc66-ac2b-d391-077c-000000000030 18823 1726855029.51732: variable 'ansible_search_path' from source: unknown 18823 1726855029.51760: calling self._execute() 18823 1726855029.51834: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.51838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.51848: variable 'omit' from source: magic vars 18823 1726855029.52114: variable 'ansible_distribution_major_version' from source: facts 18823 1726855029.52124: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855029.52130: _execute() done 18823 1726855029.52135: dumping result to json 18823 1726855029.52138: done dumping result, returning 18823 1726855029.52145: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0affcc66-ac2b-d391-077c-000000000030] 18823 1726855029.52148: sending task result for task 0affcc66-ac2b-d391-077c-000000000030 18823 1726855029.52232: done sending task result for task 0affcc66-ac2b-d391-077c-000000000030 18823 1726855029.52235: WORKER PROCESS EXITING 18823 1726855029.52272: no more pending results, returning what we have 18823 1726855029.52277: in VariableManager get_vars() 18823 1726855029.52321: Calling all_inventory to load vars for managed_node2 18823 1726855029.52324: Calling groups_inventory to load vars for managed_node2 18823 1726855029.52326: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.52338: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.52340: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.52343: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.53407: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.54768: done with get_vars() 18823 1726855029.54782: variable 'ansible_search_path' from source: unknown 18823 1726855029.54795: we have included files to process 18823 1726855029.54796: generating all_blocks data 18823 1726855029.54798: done generating all_blocks data 18823 1726855029.54801: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18823 1726855029.54801: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18823 1726855029.54805: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18823 1726855029.55059: done processing included file 18823 1726855029.55062: iterating over new_blocks loaded from include file 18823 1726855029.55065: in VariableManager get_vars() 18823 1726855029.55086: done with get_vars() 18823 1726855029.55090: filtering new block on tags 18823 1726855029.55115: done filtering new block on tags 18823 1726855029.55118: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node2 18823 1726855029.55129: extending task lists for all hosts with included blocks 18823 1726855029.55172: done extending task lists 18823 1726855029.55173: done processing included files 18823 1726855029.55176: results queue empty 18823 1726855029.55177: checking for any_errors_fatal 18823 1726855029.55180: done checking for any_errors_fatal 18823 1726855029.55181: checking for max_fail_percentage 18823 1726855029.55182: done checking for 
max_fail_percentage 18823 1726855029.55183: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.55186: done checking to see if all hosts have failed 18823 1726855029.55188: getting the remaining hosts for this loop 18823 1726855029.55189: done getting the remaining hosts for this loop 18823 1726855029.55193: getting the next task for host managed_node2 18823 1726855029.55202: done getting next task for host managed_node2 18823 1726855029.55206: ^ task is: TASK: Assert that warnings is empty 18823 1726855029.55209: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855029.55213: getting variables 18823 1726855029.55214: in VariableManager get_vars() 18823 1726855029.55228: Calling all_inventory to load vars for managed_node2 18823 1726855029.55230: Calling groups_inventory to load vars for managed_node2 18823 1726855029.55232: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.55243: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.55246: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.55250: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.56080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.57354: done with get_vars() 18823 1726855029.57376: done getting variables 18823 1726855029.57421: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 13:57:09 -0400 (0:00:00.062) 0:00:21.226 ****** 18823 1726855029.57451: entering _queue_task() for managed_node2/assert 18823 1726855029.57774: worker is 1 (out of 1 available) 18823 1726855029.57791: exiting _queue_task() for managed_node2/assert 18823 1726855029.57802: done queuing things up, now waiting for results queue to drain 18823 1726855029.57803: waiting for pending results... 
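The task queued here comes from the just-included `assert_output_in_stderr_without_warnings.yml`. The file's contents are not reproduced in this log, but the worker record `Evaluated conditional ('warnings' not in __network_connections_result): True` pins down the assertion being run; a hedged reconstruction of the task at that path:

```yaml
# Inferred from the evaluated conditional in the log; the actual file may
# differ in msg text or additional assertions.
- name: Assert that warnings is empty
  assert:
    that:
      - "'warnings' not in __network_connections_result"
```

Because the conditional holds, the action plugin short-circuits with "All assertions passed" and no module is shipped to the remote host — note there is no AnsiballZ transfer in the records that follow, only the local `assert` action plugin load.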
18823 1726855029.58208: running TaskExecutor() for managed_node2/TASK: Assert that warnings is empty 18823 1726855029.58213: in run() - task 0affcc66-ac2b-d391-077c-000000000304 18823 1726855029.58216: variable 'ansible_search_path' from source: unknown 18823 1726855029.58218: variable 'ansible_search_path' from source: unknown 18823 1726855029.58224: calling self._execute() 18823 1726855029.58317: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.58327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.58340: variable 'omit' from source: magic vars 18823 1726855029.58704: variable 'ansible_distribution_major_version' from source: facts 18823 1726855029.58722: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855029.58739: variable 'omit' from source: magic vars 18823 1726855029.58790: variable 'omit' from source: magic vars 18823 1726855029.58835: variable 'omit' from source: magic vars 18823 1726855029.58875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855029.58907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855029.58924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855029.58937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855029.58960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855029.58976: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855029.58979: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.58982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 
1726855029.59052: Set connection var ansible_timeout to 10 18823 1726855029.59057: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855029.59060: Set connection var ansible_shell_type to sh 18823 1726855029.59066: Set connection var ansible_shell_executable to /bin/sh 18823 1726855029.59070: Set connection var ansible_connection to ssh 18823 1726855029.59075: Set connection var ansible_pipelining to False 18823 1726855029.59097: variable 'ansible_shell_executable' from source: unknown 18823 1726855029.59100: variable 'ansible_connection' from source: unknown 18823 1726855029.59106: variable 'ansible_module_compression' from source: unknown 18823 1726855029.59108: variable 'ansible_shell_type' from source: unknown 18823 1726855029.59111: variable 'ansible_shell_executable' from source: unknown 18823 1726855029.59113: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.59115: variable 'ansible_pipelining' from source: unknown 18823 1726855029.59117: variable 'ansible_timeout' from source: unknown 18823 1726855029.59119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.59220: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855029.59228: variable 'omit' from source: magic vars 18823 1726855029.59234: starting attempt loop 18823 1726855029.59237: running the handler 18823 1726855029.59328: variable '__network_connections_result' from source: set_fact 18823 1726855029.59339: Evaluated conditional ('warnings' not in __network_connections_result): True 18823 1726855029.59344: handler run complete 18823 1726855029.59358: attempt loop complete, returning result 18823 1726855029.59361: _execute() done 18823 
1726855029.59363: dumping result to json 18823 1726855029.59365: done dumping result, returning 18823 1726855029.59368: done running TaskExecutor() for managed_node2/TASK: Assert that warnings is empty [0affcc66-ac2b-d391-077c-000000000304] 18823 1726855029.59374: sending task result for task 0affcc66-ac2b-d391-077c-000000000304 18823 1726855029.59453: done sending task result for task 0affcc66-ac2b-d391-077c-000000000304 18823 1726855029.59455: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18823 1726855029.59543: no more pending results, returning what we have 18823 1726855029.59547: results queue empty 18823 1726855029.59548: checking for any_errors_fatal 18823 1726855029.59549: done checking for any_errors_fatal 18823 1726855029.59550: checking for max_fail_percentage 18823 1726855029.59551: done checking for max_fail_percentage 18823 1726855029.59552: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.59553: done checking to see if all hosts have failed 18823 1726855029.59554: getting the remaining hosts for this loop 18823 1726855029.59555: done getting the remaining hosts for this loop 18823 1726855029.59558: getting the next task for host managed_node2 18823 1726855029.59563: done getting next task for host managed_node2 18823 1726855029.59565: ^ task is: TASK: Assert that there is output in stderr 18823 1726855029.59568: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18823 1726855029.59573: getting variables 18823 1726855029.59575: in VariableManager get_vars() 18823 1726855029.59609: Calling all_inventory to load vars for managed_node2 18823 1726855029.59612: Calling groups_inventory to load vars for managed_node2 18823 1726855029.59614: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.59623: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.59625: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.59627: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.65085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.65960: done with get_vars() 18823 1726855029.65975: done getting variables 18823 1726855029.66013: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 13:57:09 -0400 (0:00:00.085) 0:00:21.312 ****** 18823 1726855029.66032: entering _queue_task() for managed_node2/assert 18823 1726855029.66283: worker is 1 (out of 1 available) 18823 1726855029.66300: exiting _queue_task() for managed_node2/assert 18823 1726855029.66311: done queuing things up, now waiting for results queue to drain 18823 1726855029.66312: waiting for pending results... 
18823 1726855029.66481: running TaskExecutor() for managed_node2/TASK: Assert that there is output in stderr 18823 1726855029.66569: in run() - task 0affcc66-ac2b-d391-077c-000000000305 18823 1726855029.66580: variable 'ansible_search_path' from source: unknown 18823 1726855029.66583: variable 'ansible_search_path' from source: unknown 18823 1726855029.66626: calling self._execute() 18823 1726855029.66761: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.66765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.66768: variable 'omit' from source: magic vars 18823 1726855029.67118: variable 'ansible_distribution_major_version' from source: facts 18823 1726855029.67122: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855029.67125: variable 'omit' from source: magic vars 18823 1726855029.67152: variable 'omit' from source: magic vars 18823 1726855029.67190: variable 'omit' from source: magic vars 18823 1726855029.67318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855029.67322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855029.67325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855029.67327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855029.67330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855029.67333: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855029.67335: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.67343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 
1726855029.67436: Set connection var ansible_timeout to 10 18823 1726855029.67441: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855029.67445: Set connection var ansible_shell_type to sh 18823 1726855029.67448: Set connection var ansible_shell_executable to /bin/sh 18823 1726855029.67455: Set connection var ansible_connection to ssh 18823 1726855029.67460: Set connection var ansible_pipelining to False 18823 1726855029.67486: variable 'ansible_shell_executable' from source: unknown 18823 1726855029.67492: variable 'ansible_connection' from source: unknown 18823 1726855029.67495: variable 'ansible_module_compression' from source: unknown 18823 1726855029.67498: variable 'ansible_shell_type' from source: unknown 18823 1726855029.67500: variable 'ansible_shell_executable' from source: unknown 18823 1726855029.67502: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.67507: variable 'ansible_pipelining' from source: unknown 18823 1726855029.67509: variable 'ansible_timeout' from source: unknown 18823 1726855029.67511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.67702: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855029.67709: variable 'omit' from source: magic vars 18823 1726855029.67711: starting attempt loop 18823 1726855029.67714: running the handler 18823 1726855029.67778: variable '__network_connections_result' from source: set_fact 18823 1726855029.67809: Evaluated conditional ('stderr' in __network_connections_result): True 18823 1726855029.67812: handler run complete 18823 1726855029.67815: attempt loop complete, returning result 18823 1726855029.67817: _execute() done 18823 
1726855029.67819: dumping result to json 18823 1726855029.67822: done dumping result, returning 18823 1726855029.67824: done running TaskExecutor() for managed_node2/TASK: Assert that there is output in stderr [0affcc66-ac2b-d391-077c-000000000305] 18823 1726855029.67826: sending task result for task 0affcc66-ac2b-d391-077c-000000000305 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18823 1726855029.68019: no more pending results, returning what we have 18823 1726855029.68023: results queue empty 18823 1726855029.68024: checking for any_errors_fatal 18823 1726855029.68031: done checking for any_errors_fatal 18823 1726855029.68032: checking for max_fail_percentage 18823 1726855029.68033: done checking for max_fail_percentage 18823 1726855029.68034: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.68035: done checking to see if all hosts have failed 18823 1726855029.68035: getting the remaining hosts for this loop 18823 1726855029.68037: done getting the remaining hosts for this loop 18823 1726855029.68040: getting the next task for host managed_node2 18823 1726855029.68047: done getting next task for host managed_node2 18823 1726855029.68049: ^ task is: TASK: meta (flush_handlers) 18823 1726855029.68055: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855029.68058: getting variables 18823 1726855029.68060: in VariableManager get_vars() 18823 1726855029.68090: Calling all_inventory to load vars for managed_node2 18823 1726855029.68093: Calling groups_inventory to load vars for managed_node2 18823 1726855029.68095: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.68107: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.68109: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.68112: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.68682: done sending task result for task 0affcc66-ac2b-d391-077c-000000000305 18823 1726855029.68686: WORKER PROCESS EXITING 18823 1726855029.68852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.69862: done with get_vars() 18823 1726855029.69876: done getting variables 18823 1726855029.69957: in VariableManager get_vars() 18823 1726855029.69968: Calling all_inventory to load vars for managed_node2 18823 1726855029.69970: Calling groups_inventory to load vars for managed_node2 18823 1726855029.69972: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.69976: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.69978: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.69980: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.71676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.73604: done with get_vars() 18823 1726855029.73641: done queuing things up, now waiting for results queue to drain 18823 1726855029.73643: results queue empty 18823 1726855029.73644: checking for any_errors_fatal 18823 1726855029.73647: done checking for any_errors_fatal 18823 1726855029.73648: checking for max_fail_percentage 18823 
1726855029.73649: done checking for max_fail_percentage 18823 1726855029.73650: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.73651: done checking to see if all hosts have failed 18823 1726855029.73651: getting the remaining hosts for this loop 18823 1726855029.73659: done getting the remaining hosts for this loop 18823 1726855029.73662: getting the next task for host managed_node2 18823 1726855029.73666: done getting next task for host managed_node2 18823 1726855029.73667: ^ task is: TASK: meta (flush_handlers) 18823 1726855029.73669: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855029.73671: getting variables 18823 1726855029.73672: in VariableManager get_vars() 18823 1726855029.73686: Calling all_inventory to load vars for managed_node2 18823 1726855029.73690: Calling groups_inventory to load vars for managed_node2 18823 1726855029.73692: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.73698: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.73701: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.73703: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.75272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.77407: done with get_vars() 18823 1726855029.77429: done getting variables 18823 1726855029.77477: in VariableManager get_vars() 18823 1726855029.77491: Calling all_inventory to load vars for managed_node2 18823 1726855029.77493: Calling groups_inventory to load vars for managed_node2 18823 1726855029.77495: Calling all_plugins_inventory to load vars for managed_node2 18823 
1726855029.77499: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.77501: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.77504: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.78656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.80211: done with get_vars() 18823 1726855029.80238: done queuing things up, now waiting for results queue to drain 18823 1726855029.80241: results queue empty 18823 1726855029.80241: checking for any_errors_fatal 18823 1726855029.80243: done checking for any_errors_fatal 18823 1726855029.80243: checking for max_fail_percentage 18823 1726855029.80244: done checking for max_fail_percentage 18823 1726855029.80245: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.80246: done checking to see if all hosts have failed 18823 1726855029.80246: getting the remaining hosts for this loop 18823 1726855029.80247: done getting the remaining hosts for this loop 18823 1726855029.80250: getting the next task for host managed_node2 18823 1726855029.80253: done getting next task for host managed_node2 18823 1726855029.80254: ^ task is: None 18823 1726855029.80256: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855029.80257: done queuing things up, now waiting for results queue to drain 18823 1726855029.80257: results queue empty 18823 1726855029.80258: checking for any_errors_fatal 18823 1726855029.80259: done checking for any_errors_fatal 18823 1726855029.80259: checking for max_fail_percentage 18823 1726855029.80260: done checking for max_fail_percentage 18823 1726855029.80261: checking to see if all hosts have failed and the running result is not ok 18823 1726855029.80261: done checking to see if all hosts have failed 18823 1726855029.80262: getting the next task for host managed_node2 18823 1726855029.80265: done getting next task for host managed_node2 18823 1726855029.80265: ^ task is: None 18823 1726855029.80266: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855029.80319: in VariableManager get_vars() 18823 1726855029.80336: done with get_vars() 18823 1726855029.80343: in VariableManager get_vars() 18823 1726855029.80354: done with get_vars() 18823 1726855029.80359: variable 'omit' from source: magic vars 18823 1726855029.80394: in VariableManager get_vars() 18823 1726855029.80404: done with get_vars() 18823 1726855029.80426: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18823 1726855029.80607: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855029.80631: getting the remaining hosts for this loop 18823 1726855029.80632: done getting the remaining hosts for this loop 18823 1726855029.80635: getting the next task for host managed_node2 18823 1726855029.80638: done getting next task for host managed_node2 18823 1726855029.80640: ^ task is: TASK: Gathering Facts 18823 1726855029.80641: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855029.80643: getting variables 18823 1726855029.80644: in VariableManager get_vars() 18823 1726855029.80652: Calling all_inventory to load vars for managed_node2 18823 1726855029.80655: Calling groups_inventory to load vars for managed_node2 18823 1726855029.80657: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855029.80662: Calling all_plugins_play to load vars for managed_node2 18823 1726855029.80664: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855029.80667: Calling groups_plugins_play to load vars for managed_node2 18823 1726855029.81939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855029.84462: done with get_vars() 18823 1726855029.84486: done getting variables 18823 1726855029.84534: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 13:57:09 -0400 (0:00:00.185) 0:00:21.497 ****** 18823 1726855029.84561: entering _queue_task() for managed_node2/gather_facts 18823 1726855029.85297: worker is 1 (out of 1 available) 18823 1726855029.85308: exiting _queue_task() for managed_node2/gather_facts 18823 1726855029.85318: done queuing things up, now waiting for results queue to drain 18823 1726855029.85318: waiting for pending results... 
18823 1726855029.86106: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855029.86112: in run() - task 0affcc66-ac2b-d391-077c-000000000316 18823 1726855029.86116: variable 'ansible_search_path' from source: unknown 18823 1726855029.86143: calling self._execute() 18823 1726855029.86355: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.86365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.86378: variable 'omit' from source: magic vars 18823 1726855029.86900: variable 'ansible_distribution_major_version' from source: facts 18823 1726855029.86918: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855029.86929: variable 'omit' from source: magic vars 18823 1726855029.86967: variable 'omit' from source: magic vars 18823 1726855029.87010: variable 'omit' from source: magic vars 18823 1726855029.87054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855029.87103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855029.87133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855029.87157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855029.87179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855029.87221: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855029.87228: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.87234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.87334: Set connection var ansible_timeout to 10 18823 1726855029.87348: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855029.87356: Set connection var ansible_shell_type to sh 18823 1726855029.87368: Set connection var ansible_shell_executable to /bin/sh 18823 1726855029.87380: Set connection var ansible_connection to ssh 18823 1726855029.87394: Set connection var ansible_pipelining to False 18823 1726855029.87428: variable 'ansible_shell_executable' from source: unknown 18823 1726855029.87436: variable 'ansible_connection' from source: unknown 18823 1726855029.87443: variable 'ansible_module_compression' from source: unknown 18823 1726855029.87448: variable 'ansible_shell_type' from source: unknown 18823 1726855029.87453: variable 'ansible_shell_executable' from source: unknown 18823 1726855029.87458: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855029.87463: variable 'ansible_pipelining' from source: unknown 18823 1726855029.87468: variable 'ansible_timeout' from source: unknown 18823 1726855029.87474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855029.87651: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855029.87666: variable 'omit' from source: magic vars 18823 1726855029.87674: starting attempt loop 18823 1726855029.87679: running the handler 18823 1726855029.87700: variable 'ansible_facts' from source: unknown 18823 1726855029.87731: _low_level_execute_command(): starting 18823 1726855029.87747: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855029.88510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855029.88600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855029.88617: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855029.88632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855029.88652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.88791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.90512: stdout chunk (state=3): >>>/root <<< 18823 1726855029.90782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855029.90785: stdout chunk (state=3): >>><<< 18823 1726855029.90790: stderr chunk (state=3): >>><<< 18823 1726855029.90795: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855029.90797: _low_level_execute_command(): starting 18823 1726855029.90809: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847 `" && echo ansible-tmp-1726855029.9076276-19879-266394186662847="` echo /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847 `" ) && sleep 0' 18823 1726855029.92056: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855029.92126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.92130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855029.92139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855029.92141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.92271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.92377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.94322: stdout chunk (state=3): >>>ansible-tmp-1726855029.9076276-19879-266394186662847=/root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847 <<< 18823 1726855029.94440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855029.94895: stderr chunk (state=3): >>><<< 18823 1726855029.94898: stdout chunk (state=3): >>><<< 18823 1726855029.94902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855029.9076276-19879-266394186662847=/root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855029.94905: variable 'ansible_module_compression' from source: unknown 18823 1726855029.94907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855029.94909: variable 'ansible_facts' from source: unknown 18823 1726855029.95355: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py 18823 1726855029.95671: Sending initial data 18823 1726855029.95681: Sent initial data (154 bytes) 18823 1726855029.96885: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855029.96891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.96895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855029.96897: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855029.97205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855029.97209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855029.98912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855029.99022: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpy5m4rm4c /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py <<< 18823 1726855029.99035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py" <<< 18823 1726855029.99095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpy5m4rm4c" to remote "/root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py" <<< 18823 1726855029.99134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py" <<< 18823 1726855030.01905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855030.01983: stderr chunk (state=3): >>><<< 18823 1726855030.02196: stdout chunk (state=3): >>><<< 18823 1726855030.02200: done transferring module to remote 18823 1726855030.02202: _low_level_execute_command(): starting 18823 1726855030.02208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/ /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py && sleep 0' 18823 1726855030.03275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855030.03700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855030.03724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855030.03883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855030.05676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855030.05739: stderr chunk (state=3): >>><<< 18823 1726855030.05761: stdout chunk (state=3): >>><<< 18823 1726855030.05878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855030.05890: _low_level_execute_command(): starting 18823 1726855030.05906: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/AnsiballZ_setup.py && sleep 0' 18823 1726855030.07367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855030.07381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855030.07394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855030.07447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855030.07458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855030.07475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855030.07669: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855030.73589: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "10", "epoch": "1726855030", "epoch_int": "1726855030", "date": "2024-09-20", "time": "13:57:10", "iso8601_micro": "2024-09-20T17:57:10.346627Z", "iso8601": "2024-09-20T17:57:10Z", "iso8601_basic": "20240920T135710346627", "iso8601_basic_short": "20240920T135710", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.62060546875, "5m": 0.42724609375, "15m": 0.21728515625}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": 
"NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 813, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795004416, "block_size": 4096, "block_total": 65519099, "block_available": 63914796, "block_used": 1604303, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": 
[3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", 
"tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 
10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855030.75551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855030.75556: stdout chunk (state=3): >>><<< 18823 1726855030.75558: stderr chunk (state=3): >>><<< 18823 1726855030.75747: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "10", "epoch": "1726855030", "epoch_int": "1726855030", "date": "2024-09-20", "time": "13:57:10", "iso8601_micro": "2024-09-20T17:57:10.346627Z", "iso8601": "2024-09-20T17:57:10Z", "iso8601_basic": "20240920T135710346627", "iso8601_basic_short": "20240920T135710", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.62060546875, "5m": 0.42724609375, "15m": 0.21728515625}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": 
"NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 813, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795004416, "block_size": 4096, "block_total": 65519099, "block_available": 63914796, "block_used": 1604303, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": 
[3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", 
"tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 
10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855030.76670: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855030.76734: _low_level_execute_command(): starting 18823 1726855030.76801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855029.9076276-19879-266394186662847/ > /dev/null 2>&1 && sleep 0' 18823 1726855030.78143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855030.78245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855030.78481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855030.78506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855030.78679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855030.80594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855030.80598: stdout chunk (state=3): >>><<< 18823 1726855030.80614: stderr chunk (state=3): >>><<< 18823 1726855030.80637: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855030.80653: handler run complete 18823 1726855030.81193: variable 'ansible_facts' from source: unknown 18823 1726855030.81198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855030.81900: variable 'ansible_facts' from source: unknown 18823 1726855030.82226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855030.82639: attempt loop complete, returning result 18823 1726855030.83093: _execute() done 18823 1726855030.83097: dumping result to json 18823 1726855030.83099: done dumping result, returning 18823 1726855030.83101: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-000000000316] 18823 1726855030.83106: sending task result for task 0affcc66-ac2b-d391-077c-000000000316 18823 1726855030.83994: done sending task result for task 0affcc66-ac2b-d391-077c-000000000316 18823 1726855030.83998: WORKER PROCESS EXITING ok: [managed_node2] 18823 1726855030.84807: no more pending results, returning what we have 18823 1726855030.84811: results queue empty 18823 1726855030.84812: checking for any_errors_fatal 18823 1726855030.84813: done checking for any_errors_fatal 18823 1726855030.84814: checking for max_fail_percentage 18823 1726855030.84816: done checking for 
max_fail_percentage 18823 1726855030.84817: checking to see if all hosts have failed and the running result is not ok 18823 1726855030.84818: done checking to see if all hosts have failed 18823 1726855030.84818: getting the remaining hosts for this loop 18823 1726855030.84820: done getting the remaining hosts for this loop 18823 1726855030.84823: getting the next task for host managed_node2 18823 1726855030.84828: done getting next task for host managed_node2 18823 1726855030.84829: ^ task is: TASK: meta (flush_handlers) 18823 1726855030.84831: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855030.84836: getting variables 18823 1726855030.84837: in VariableManager get_vars() 18823 1726855030.84859: Calling all_inventory to load vars for managed_node2 18823 1726855030.84862: Calling groups_inventory to load vars for managed_node2 18823 1726855030.84865: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855030.84875: Calling all_plugins_play to load vars for managed_node2 18823 1726855030.84878: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855030.84882: Calling groups_plugins_play to load vars for managed_node2 18823 1726855030.88374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855030.92103: done with get_vars() 18823 1726855030.92139: done getting variables 18823 1726855030.92210: in VariableManager get_vars() 18823 1726855030.92221: Calling all_inventory to load vars for managed_node2 18823 1726855030.92223: Calling groups_inventory to load vars for managed_node2 18823 1726855030.92225: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855030.92347: Calling 
all_plugins_play to load vars for managed_node2 18823 1726855030.92352: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855030.92356: Calling groups_plugins_play to load vars for managed_node2 18823 1726855030.94790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855030.99008: done with get_vars() 18823 1726855030.99044: done queuing things up, now waiting for results queue to drain 18823 1726855030.99046: results queue empty 18823 1726855030.99047: checking for any_errors_fatal 18823 1726855030.99052: done checking for any_errors_fatal 18823 1726855030.99052: checking for max_fail_percentage 18823 1726855030.99053: done checking for max_fail_percentage 18823 1726855030.99054: checking to see if all hosts have failed and the running result is not ok 18823 1726855030.99060: done checking to see if all hosts have failed 18823 1726855030.99060: getting the remaining hosts for this loop 18823 1726855030.99061: done getting the remaining hosts for this loop 18823 1726855030.99064: getting the next task for host managed_node2 18823 1726855030.99068: done getting next task for host managed_node2 18823 1726855030.99070: ^ task is: TASK: Show network_provider 18823 1726855030.99071: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855030.99074: getting variables 18823 1726855030.99075: in VariableManager get_vars() 18823 1726855030.99084: Calling all_inventory to load vars for managed_node2 18823 1726855030.99086: Calling groups_inventory to load vars for managed_node2 18823 1726855030.99293: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855030.99300: Calling all_plugins_play to load vars for managed_node2 18823 1726855030.99305: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855030.99309: Calling groups_plugins_play to load vars for managed_node2 18823 1726855031.01807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855031.05045: done with get_vars() 18823 1726855031.05074: done getting variables 18823 1726855031.05125: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 13:57:11 -0400 (0:00:01.205) 0:00:22.703 ****** 18823 1726855031.05153: entering _queue_task() for managed_node2/debug 18823 1726855031.05922: worker is 1 (out of 1 available) 18823 1726855031.05932: exiting _queue_task() for managed_node2/debug 18823 1726855031.05945: done queuing things up, now waiting for results queue to drain 18823 1726855031.05947: waiting for pending results... 
18823 1726855031.06754: running TaskExecutor() for managed_node2/TASK: Show network_provider 18823 1726855031.07139: in run() - task 0affcc66-ac2b-d391-077c-000000000033 18823 1726855031.07143: variable 'ansible_search_path' from source: unknown 18823 1726855031.07146: calling self._execute() 18823 1726855031.07395: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855031.07399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855031.07402: variable 'omit' from source: magic vars 18823 1726855031.08573: variable 'ansible_distribution_major_version' from source: facts 18823 1726855031.08607: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855031.08703: variable 'omit' from source: magic vars 18823 1726855031.08858: variable 'omit' from source: magic vars 18823 1726855031.08904: variable 'omit' from source: magic vars 18823 1726855031.09001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855031.09212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855031.09239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855031.09302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855031.09602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855031.09605: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855031.09608: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855031.09611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855031.09876: Set connection var ansible_timeout to 10 18823 1726855031.09892: Set 
connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855031.09902: Set connection var ansible_shell_type to sh 18823 1726855031.09913: Set connection var ansible_shell_executable to /bin/sh 18823 1726855031.09934: Set connection var ansible_connection to ssh 18823 1726855031.10192: Set connection var ansible_pipelining to False 18823 1726855031.10196: variable 'ansible_shell_executable' from source: unknown 18823 1726855031.10198: variable 'ansible_connection' from source: unknown 18823 1726855031.10200: variable 'ansible_module_compression' from source: unknown 18823 1726855031.10202: variable 'ansible_shell_type' from source: unknown 18823 1726855031.10204: variable 'ansible_shell_executable' from source: unknown 18823 1726855031.10206: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855031.10208: variable 'ansible_pipelining' from source: unknown 18823 1726855031.10209: variable 'ansible_timeout' from source: unknown 18823 1726855031.10211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855031.10454: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855031.10701: variable 'omit' from source: magic vars 18823 1726855031.10712: starting attempt loop 18823 1726855031.10719: running the handler 18823 1726855031.10795: variable 'network_provider' from source: set_fact 18823 1726855031.11095: variable 'network_provider' from source: set_fact 18823 1726855031.11110: handler run complete 18823 1726855031.11392: attempt loop complete, returning result 18823 1726855031.11396: _execute() done 18823 1726855031.11398: dumping result to json 18823 1726855031.11400: done dumping result, returning 18823 1726855031.11403: done running 
TaskExecutor() for managed_node2/TASK: Show network_provider [0affcc66-ac2b-d391-077c-000000000033] 18823 1726855031.11405: sending task result for task 0affcc66-ac2b-d391-077c-000000000033 18823 1726855031.11663: done sending task result for task 0affcc66-ac2b-d391-077c-000000000033 18823 1726855031.11667: WORKER PROCESS EXITING ok: [managed_node2] => { "network_provider": "nm" } 18823 1726855031.11719: no more pending results, returning what we have 18823 1726855031.11722: results queue empty 18823 1726855031.11724: checking for any_errors_fatal 18823 1726855031.11727: done checking for any_errors_fatal 18823 1726855031.11728: checking for max_fail_percentage 18823 1726855031.11729: done checking for max_fail_percentage 18823 1726855031.11730: checking to see if all hosts have failed and the running result is not ok 18823 1726855031.11731: done checking to see if all hosts have failed 18823 1726855031.11732: getting the remaining hosts for this loop 18823 1726855031.11733: done getting the remaining hosts for this loop 18823 1726855031.11737: getting the next task for host managed_node2 18823 1726855031.11745: done getting next task for host managed_node2 18823 1726855031.11747: ^ task is: TASK: meta (flush_handlers) 18823 1726855031.11749: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855031.11754: getting variables 18823 1726855031.11756: in VariableManager get_vars() 18823 1726855031.12011: Calling all_inventory to load vars for managed_node2 18823 1726855031.12014: Calling groups_inventory to load vars for managed_node2 18823 1726855031.12018: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855031.12030: Calling all_plugins_play to load vars for managed_node2 18823 1726855031.12034: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855031.12037: Calling groups_plugins_play to load vars for managed_node2 18823 1726855031.15102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855031.19565: done with get_vars() 18823 1726855031.19595: done getting variables 18823 1726855031.19775: in VariableManager get_vars() 18823 1726855031.19786: Calling all_inventory to load vars for managed_node2 18823 1726855031.19789: Calling groups_inventory to load vars for managed_node2 18823 1726855031.19792: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855031.19797: Calling all_plugins_play to load vars for managed_node2 18823 1726855031.19799: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855031.19802: Calling groups_plugins_play to load vars for managed_node2 18823 1726855031.21914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855031.24350: done with get_vars() 18823 1726855031.24385: done queuing things up, now waiting for results queue to drain 18823 1726855031.24389: results queue empty 18823 1726855031.24390: checking for any_errors_fatal 18823 1726855031.24393: done checking for any_errors_fatal 18823 1726855031.24394: checking for max_fail_percentage 18823 1726855031.24395: done checking for max_fail_percentage 18823 1726855031.24396: checking to see if all hosts have failed and the running result is not 
ok 18823 1726855031.24396: done checking to see if all hosts have failed 18823 1726855031.24397: getting the remaining hosts for this loop 18823 1726855031.24398: done getting the remaining hosts for this loop 18823 1726855031.24401: getting the next task for host managed_node2 18823 1726855031.24413: done getting next task for host managed_node2 18823 1726855031.24414: ^ task is: TASK: meta (flush_handlers) 18823 1726855031.24416: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855031.24419: getting variables 18823 1726855031.24420: in VariableManager get_vars() 18823 1726855031.24430: Calling all_inventory to load vars for managed_node2 18823 1726855031.24432: Calling groups_inventory to load vars for managed_node2 18823 1726855031.24435: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855031.24440: Calling all_plugins_play to load vars for managed_node2 18823 1726855031.24442: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855031.24445: Calling groups_plugins_play to load vars for managed_node2 18823 1726855031.26143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855031.27730: done with get_vars() 18823 1726855031.27757: done getting variables 18823 1726855031.27814: in VariableManager get_vars() 18823 1726855031.27825: Calling all_inventory to load vars for managed_node2 18823 1726855031.27827: Calling groups_inventory to load vars for managed_node2 18823 1726855031.27829: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855031.27835: Calling all_plugins_play to load vars for managed_node2 18823 1726855031.27837: Calling groups_plugins_inventory to load vars for 
managed_node2 18823 1726855031.27840: Calling groups_plugins_play to load vars for managed_node2 18823 1726855031.29072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855031.32378: done with get_vars() 18823 1726855031.32415: done queuing things up, now waiting for results queue to drain 18823 1726855031.32417: results queue empty 18823 1726855031.32418: checking for any_errors_fatal 18823 1726855031.32420: done checking for any_errors_fatal 18823 1726855031.32420: checking for max_fail_percentage 18823 1726855031.32422: done checking for max_fail_percentage 18823 1726855031.32422: checking to see if all hosts have failed and the running result is not ok 18823 1726855031.32423: done checking to see if all hosts have failed 18823 1726855031.32424: getting the remaining hosts for this loop 18823 1726855031.32425: done getting the remaining hosts for this loop 18823 1726855031.32428: getting the next task for host managed_node2 18823 1726855031.32431: done getting next task for host managed_node2 18823 1726855031.32432: ^ task is: None 18823 1726855031.32433: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855031.32434: done queuing things up, now waiting for results queue to drain 18823 1726855031.32435: results queue empty 18823 1726855031.32436: checking for any_errors_fatal 18823 1726855031.32437: done checking for any_errors_fatal 18823 1726855031.32437: checking for max_fail_percentage 18823 1726855031.32438: done checking for max_fail_percentage 18823 1726855031.32439: checking to see if all hosts have failed and the running result is not ok 18823 1726855031.32439: done checking to see if all hosts have failed 18823 1726855031.32441: getting the next task for host managed_node2 18823 1726855031.32443: done getting next task for host managed_node2 18823 1726855031.32443: ^ task is: None 18823 1726855031.32444: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855031.32484: in VariableManager get_vars() 18823 1726855031.32508: done with get_vars() 18823 1726855031.32514: in VariableManager get_vars() 18823 1726855031.32527: done with get_vars() 18823 1726855031.32531: variable 'omit' from source: magic vars 18823 1726855031.32654: variable 'profile' from source: play vars 18823 1726855031.32774: in VariableManager get_vars() 18823 1726855031.32791: done with get_vars() 18823 1726855031.32816: variable 'omit' from source: magic vars 18823 1726855031.32878: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18823 1726855031.33533: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855031.33560: getting the remaining hosts for this loop 18823 1726855031.33562: done getting the remaining hosts for this loop 18823 1726855031.33564: getting the next task for host managed_node2 18823 1726855031.33568: done getting next task for host managed_node2 18823 1726855031.33570: ^ task is: TASK: Gathering Facts 18823 1726855031.33571: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855031.33573: getting variables 18823 1726855031.33574: in VariableManager get_vars() 18823 1726855031.33635: Calling all_inventory to load vars for managed_node2 18823 1726855031.33638: Calling groups_inventory to load vars for managed_node2 18823 1726855031.33640: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855031.33646: Calling all_plugins_play to load vars for managed_node2 18823 1726855031.33649: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855031.33651: Calling groups_plugins_play to load vars for managed_node2 18823 1726855031.36332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855031.39129: done with get_vars() 18823 1726855031.39159: done getting variables 18823 1726855031.39210: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:57:11 -0400 (0:00:00.340) 0:00:23.044 ****** 18823 1726855031.39242: entering _queue_task() for managed_node2/gather_facts 18823 1726855031.39699: worker is 1 (out of 1 available) 18823 1726855031.39710: exiting _queue_task() for managed_node2/gather_facts 18823 1726855031.39720: done queuing things up, now waiting for results queue to drain 18823 1726855031.39721: waiting for pending results... 
18823 1726855031.40002: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855031.40018: in run() - task 0affcc66-ac2b-d391-077c-00000000032b 18823 1726855031.40038: variable 'ansible_search_path' from source: unknown 18823 1726855031.40075: calling self._execute() 18823 1726855031.40273: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855031.40311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855031.40547: variable 'omit' from source: magic vars 18823 1726855031.41229: variable 'ansible_distribution_major_version' from source: facts 18823 1726855031.41247: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855031.41259: variable 'omit' from source: magic vars 18823 1726855031.41295: variable 'omit' from source: magic vars 18823 1726855031.41439: variable 'omit' from source: magic vars 18823 1726855031.41486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855031.41573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855031.41864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855031.41867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855031.41870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855031.41872: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855031.41874: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855031.41876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855031.42053: Set connection var ansible_timeout to 10 18823 1726855031.42118: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855031.42131: Set connection var ansible_shell_type to sh 18823 1726855031.42163: Set connection var ansible_shell_executable to /bin/sh 18823 1726855031.42173: Set connection var ansible_connection to ssh 18823 1726855031.42181: Set connection var ansible_pipelining to False 18823 1726855031.42269: variable 'ansible_shell_executable' from source: unknown 18823 1726855031.42277: variable 'ansible_connection' from source: unknown 18823 1726855031.42284: variable 'ansible_module_compression' from source: unknown 18823 1726855031.42296: variable 'ansible_shell_type' from source: unknown 18823 1726855031.42308: variable 'ansible_shell_executable' from source: unknown 18823 1726855031.42316: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855031.42327: variable 'ansible_pipelining' from source: unknown 18823 1726855031.42338: variable 'ansible_timeout' from source: unknown 18823 1726855031.42346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855031.42560: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855031.42576: variable 'omit' from source: magic vars 18823 1726855031.42633: starting attempt loop 18823 1726855031.42637: running the handler 18823 1726855031.42639: variable 'ansible_facts' from source: unknown 18823 1726855031.42642: _low_level_execute_command(): starting 18823 1726855031.42650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855031.43432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855031.43513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855031.43557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855031.43581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855031.43633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855031.43748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855031.45503: stdout chunk (state=3): >>>/root <<< 18823 1726855031.45728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855031.45732: stdout chunk (state=3): >>><<< 18823 1726855031.45734: stderr chunk (state=3): >>><<< 18823 1726855031.46021: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855031.46025: _low_level_execute_command(): starting 18823 1726855031.46028: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783 `" && echo ansible-tmp-1726855031.4580371-19934-160837773018783="` echo /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783 `" ) && sleep 0' 18823 1726855031.47194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855031.47410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855031.47512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855031.49440: stdout chunk (state=3): >>>ansible-tmp-1726855031.4580371-19934-160837773018783=/root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783 <<< 18823 1726855031.49617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855031.49621: stdout chunk (state=3): >>><<< 18823 1726855031.49623: stderr chunk (state=3): >>><<< 18823 1726855031.49995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855031.4580371-19934-160837773018783=/root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855031.49999: variable 'ansible_module_compression' from source: unknown 18823 1726855031.50002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855031.50023: variable 'ansible_facts' from source: unknown 18823 1726855031.50398: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py 18823 1726855031.50768: Sending initial data 18823 1726855031.50907: Sent initial data (154 bytes) 18823 1726855031.52210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
<<< 18823 1726855031.52312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855031.53804: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855031.53871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855031.53945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp866tg0nk /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py <<< 18823 1726855031.53961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py" <<< 18823 1726855031.54046: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp866tg0nk" to remote "/root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py" <<< 18823 1726855031.56545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855031.56560: stdout chunk (state=3): >>><<< 18823 1726855031.56581: stderr chunk (state=3): >>><<< 18823 1726855031.56640: done transferring module to remote 18823 1726855031.56709: _low_level_execute_command(): starting 18823 1726855031.56793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/ /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py && sleep 0' 18823 1726855031.57649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855031.57663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855031.57676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855031.57697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855031.57799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855031.57816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855031.57961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855031.59880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855031.59901: stdout chunk (state=3): >>><<< 18823 1726855031.60211: stderr chunk (state=3): >>><<< 18823 1726855031.60215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855031.60226: _low_level_execute_command(): starting 18823 1726855031.60229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/AnsiballZ_setup.py && sleep 0' 18823 1726855031.61508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855031.61524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855031.61548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855031.61659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 18823 1726855032.25578: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.62060546875, "5m": 0.42724609375, "15m": 0.21728515625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": 
"NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 815, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794451456, "block_size": 4096, "block_total": 65519099, "block_available": 63914661, "block_used": 1604438, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZi<<< 18823 1726855032.25631: stdout chunk (state=3): >>>B2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "12", "epoch": "1726855032", "epoch_int": "1726855032", "date": "2024-09-20", "time": "13:57:12", "iso8601_micro": "2024-09-20T17:57:12.194039Z", "iso8601": "2024-09-20T17:57:12Z", "iso8601_basic": "20240920T135712194039", "iso8601_basic_short": "20240920T135712", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0", "lsr27", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 
10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855032.27597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855032.27611: stdout chunk (state=3): >>><<< 18823 1726855032.27637: stderr chunk (state=3): >>><<< 18823 1726855032.27685: _low_level_execute_command() done: rc=0, stdout= [facts JSON identical to the stdout chunks above] , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855032.28206: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855032.28234: _low_level_execute_command(): starting 18823 1726855032.28245: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855031.4580371-19934-160837773018783/ > /dev/null 2>&1 && sleep 0' 18823 1726855032.28909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855032.28924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855032.28961: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855032.29068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855032.29119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855032.29191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855032.31029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855032.31051: stderr chunk (state=3): >>><<< 18823 1726855032.31057: stdout chunk (state=3): >>><<< 18823 1726855032.31073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855032.31093: handler run complete 18823 1726855032.31177: variable 'ansible_facts' from source: unknown 18823 1726855032.31251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.31457: variable 'ansible_facts' from source: unknown 18823 1726855032.31533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.31622: attempt loop complete, returning result 18823 1726855032.31625: _execute() done 18823 1726855032.31628: dumping result to json 18823 1726855032.31652: done dumping result, returning 18823 1726855032.31660: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-00000000032b] 18823 1726855032.31663: sending task result for task 0affcc66-ac2b-d391-077c-00000000032b 18823 1726855032.31985: done sending task result for task 0affcc66-ac2b-d391-077c-00000000032b ok: [managed_node2] 18823 1726855032.32247: no more pending results, returning what we have 18823 1726855032.32249: results queue empty 18823 1726855032.32250: checking for any_errors_fatal 18823 1726855032.32251: done checking for any_errors_fatal 18823 1726855032.32251: checking for max_fail_percentage 18823 1726855032.32252: done checking for max_fail_percentage 18823 
1726855032.32253: checking to see if all hosts have failed and the running result is not ok 18823 1726855032.32253: done checking to see if all hosts have failed 18823 1726855032.32254: getting the remaining hosts for this loop 18823 1726855032.32255: done getting the remaining hosts for this loop 18823 1726855032.32257: getting the next task for host managed_node2 18823 1726855032.32260: done getting next task for host managed_node2 18823 1726855032.32261: ^ task is: TASK: meta (flush_handlers) 18823 1726855032.32263: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855032.32265: getting variables 18823 1726855032.32266: in VariableManager get_vars() 18823 1726855032.32289: Calling all_inventory to load vars for managed_node2 18823 1726855032.32291: Calling groups_inventory to load vars for managed_node2 18823 1726855032.32293: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.32302: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.32306: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.32308: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.32831: WORKER PROCESS EXITING 18823 1726855032.33653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.34534: done with get_vars() 18823 1726855032.34550: done getting variables 18823 1726855032.34604: in VariableManager get_vars() 18823 1726855032.34613: Calling all_inventory to load vars for managed_node2 18823 1726855032.34615: Calling groups_inventory to load vars for managed_node2 18823 1726855032.34616: Calling all_plugins_inventory to load vars for managed_node2 18823 
1726855032.34619: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.34621: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.34622: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.35267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.36201: done with get_vars() 18823 1726855032.36221: done queuing things up, now waiting for results queue to drain 18823 1726855032.36223: results queue empty 18823 1726855032.36223: checking for any_errors_fatal 18823 1726855032.36226: done checking for any_errors_fatal 18823 1726855032.36226: checking for max_fail_percentage 18823 1726855032.36227: done checking for max_fail_percentage 18823 1726855032.36227: checking to see if all hosts have failed and the running result is not ok 18823 1726855032.36231: done checking to see if all hosts have failed 18823 1726855032.36232: getting the remaining hosts for this loop 18823 1726855032.36232: done getting the remaining hosts for this loop 18823 1726855032.36235: getting the next task for host managed_node2 18823 1726855032.36237: done getting next task for host managed_node2 18823 1726855032.36239: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18823 1726855032.36240: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855032.36247: getting variables 18823 1726855032.36248: in VariableManager get_vars() 18823 1726855032.36260: Calling all_inventory to load vars for managed_node2 18823 1726855032.36262: Calling groups_inventory to load vars for managed_node2 18823 1726855032.36263: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.36266: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.36268: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.36269: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.37405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.38913: done with get_vars() 18823 1726855032.38936: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:57:12 -0400 (0:00:00.997) 0:00:24.041 ****** 18823 1726855032.39021: entering _queue_task() for managed_node2/include_tasks 18823 1726855032.39366: worker is 1 (out of 1 available) 18823 1726855032.39379: exiting _queue_task() for managed_node2/include_tasks 18823 1726855032.39396: done queuing things up, now waiting for results queue to drain 18823 1726855032.39397: waiting for pending results... 
18823 1726855032.39660: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18823 1726855032.39733: in run() - task 0affcc66-ac2b-d391-077c-00000000003c 18823 1726855032.39741: variable 'ansible_search_path' from source: unknown 18823 1726855032.39744: variable 'ansible_search_path' from source: unknown 18823 1726855032.39775: calling self._execute() 18823 1726855032.39845: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.39849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855032.39860: variable 'omit' from source: magic vars 18823 1726855032.40132: variable 'ansible_distribution_major_version' from source: facts 18823 1726855032.40141: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855032.40148: _execute() done 18823 1726855032.40151: dumping result to json 18823 1726855032.40154: done dumping result, returning 18823 1726855032.40167: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-d391-077c-00000000003c] 18823 1726855032.40169: sending task result for task 0affcc66-ac2b-d391-077c-00000000003c 18823 1726855032.40253: done sending task result for task 0affcc66-ac2b-d391-077c-00000000003c 18823 1726855032.40256: WORKER PROCESS EXITING 18823 1726855032.40312: no more pending results, returning what we have 18823 1726855032.40317: in VariableManager get_vars() 18823 1726855032.40354: Calling all_inventory to load vars for managed_node2 18823 1726855032.40357: Calling groups_inventory to load vars for managed_node2 18823 1726855032.40359: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.40374: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.40378: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.40381: Calling 
groups_plugins_play to load vars for managed_node2 18823 1726855032.41236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.43786: done with get_vars() 18823 1726855032.43809: variable 'ansible_search_path' from source: unknown 18823 1726855032.43810: variable 'ansible_search_path' from source: unknown 18823 1726855032.43839: we have included files to process 18823 1726855032.43840: generating all_blocks data 18823 1726855032.43841: done generating all_blocks data 18823 1726855032.43842: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855032.43843: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855032.43845: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855032.44773: done processing included file 18823 1726855032.44775: iterating over new_blocks loaded from include file 18823 1726855032.44777: in VariableManager get_vars() 18823 1726855032.44800: done with get_vars() 18823 1726855032.44802: filtering new block on tags 18823 1726855032.44817: done filtering new block on tags 18823 1726855032.44820: in VariableManager get_vars() 18823 1726855032.44838: done with get_vars() 18823 1726855032.44840: filtering new block on tags 18823 1726855032.44856: done filtering new block on tags 18823 1726855032.44858: in VariableManager get_vars() 18823 1726855032.44876: done with get_vars() 18823 1726855032.44878: filtering new block on tags 18823 1726855032.44900: done filtering new block on tags 18823 1726855032.44902: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 18823 1726855032.44907: extending task lists for 
all hosts with included blocks 18823 1726855032.45256: done extending task lists 18823 1726855032.45258: done processing included files 18823 1726855032.45259: results queue empty 18823 1726855032.45260: checking for any_errors_fatal 18823 1726855032.45261: done checking for any_errors_fatal 18823 1726855032.45262: checking for max_fail_percentage 18823 1726855032.45263: done checking for max_fail_percentage 18823 1726855032.45263: checking to see if all hosts have failed and the running result is not ok 18823 1726855032.45264: done checking to see if all hosts have failed 18823 1726855032.45265: getting the remaining hosts for this loop 18823 1726855032.45266: done getting the remaining hosts for this loop 18823 1726855032.45269: getting the next task for host managed_node2 18823 1726855032.45272: done getting next task for host managed_node2 18823 1726855032.45275: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18823 1726855032.45277: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855032.45286: getting variables 18823 1726855032.45289: in VariableManager get_vars() 18823 1726855032.45301: Calling all_inventory to load vars for managed_node2 18823 1726855032.45304: Calling groups_inventory to load vars for managed_node2 18823 1726855032.45305: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.45311: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.45313: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.45316: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.52299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.53920: done with get_vars() 18823 1726855032.53948: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:57:12 -0400 (0:00:00.150) 0:00:24.192 ****** 18823 1726855032.54045: entering _queue_task() for managed_node2/setup 18823 1726855032.54430: worker is 1 (out of 1 available) 18823 1726855032.54447: exiting _queue_task() for managed_node2/setup 18823 1726855032.54466: done queuing things up, now waiting for results queue to drain 18823 1726855032.54468: waiting for pending results... 
18823 1726855032.54912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18823 1726855032.54919: in run() - task 0affcc66-ac2b-d391-077c-00000000036c 18823 1726855032.54922: variable 'ansible_search_path' from source: unknown 18823 1726855032.54924: variable 'ansible_search_path' from source: unknown 18823 1726855032.54963: calling self._execute() 18823 1726855032.55065: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.55079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855032.55104: variable 'omit' from source: magic vars 18823 1726855032.55518: variable 'ansible_distribution_major_version' from source: facts 18823 1726855032.55548: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855032.55993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855032.57985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855032.58070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855032.58113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855032.58152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855032.58191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855032.58272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855032.58313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855032.58344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855032.58395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855032.58416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855032.58472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855032.58508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855032.58603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855032.58607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855032.58610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855032.58759: variable '__network_required_facts' from source: role 
'' defaults 18823 1726855032.58771: variable 'ansible_facts' from source: unknown 18823 1726855032.60183: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18823 1726855032.60194: when evaluation is False, skipping this task 18823 1726855032.60229: _execute() done 18823 1726855032.60239: dumping result to json 18823 1726855032.60249: done dumping result, returning 18823 1726855032.60330: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-d391-077c-00000000036c] 18823 1726855032.60333: sending task result for task 0affcc66-ac2b-d391-077c-00000000036c 18823 1726855032.60626: done sending task result for task 0affcc66-ac2b-d391-077c-00000000036c 18823 1726855032.60629: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855032.60699: no more pending results, returning what we have 18823 1726855032.60703: results queue empty 18823 1726855032.60704: checking for any_errors_fatal 18823 1726855032.60706: done checking for any_errors_fatal 18823 1726855032.60707: checking for max_fail_percentage 18823 1726855032.60708: done checking for max_fail_percentage 18823 1726855032.60709: checking to see if all hosts have failed and the running result is not ok 18823 1726855032.60710: done checking to see if all hosts have failed 18823 1726855032.60711: getting the remaining hosts for this loop 18823 1726855032.60713: done getting the remaining hosts for this loop 18823 1726855032.60717: getting the next task for host managed_node2 18823 1726855032.60727: done getting next task for host managed_node2 18823 1726855032.60731: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18823 1726855032.60734: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855032.60755: getting variables 18823 1726855032.60757: in VariableManager get_vars() 18823 1726855032.60810: Calling all_inventory to load vars for managed_node2 18823 1726855032.60814: Calling groups_inventory to load vars for managed_node2 18823 1726855032.60816: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.60827: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.60831: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.60834: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.64157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.66276: done with get_vars() 18823 1726855032.66309: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:57:12 -0400 (0:00:00.123) 0:00:24.315 ****** 18823 1726855032.66423: entering _queue_task() for managed_node2/stat 18823 1726855032.66796: worker is 1 (out of 1 available) 18823 1726855032.66808: exiting _queue_task() for managed_node2/stat 18823 1726855032.66821: done queuing things up, now waiting for results queue to drain 18823 1726855032.66822: waiting for pending results... 
18823 1726855032.67211: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 18823 1726855032.67281: in run() - task 0affcc66-ac2b-d391-077c-00000000036e 18823 1726855032.67314: variable 'ansible_search_path' from source: unknown 18823 1726855032.67322: variable 'ansible_search_path' from source: unknown 18823 1726855032.67363: calling self._execute() 18823 1726855032.67466: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.67481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855032.67502: variable 'omit' from source: magic vars 18823 1726855032.67914: variable 'ansible_distribution_major_version' from source: facts 18823 1726855032.67938: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855032.68116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855032.68405: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855032.68453: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855032.68539: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855032.68577: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855032.68681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855032.68726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855032.68762: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855032.68800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855032.68920: variable '__network_is_ostree' from source: set_fact 18823 1726855032.68937: Evaluated conditional (not __network_is_ostree is defined): False 18823 1726855032.69046: when evaluation is False, skipping this task 18823 1726855032.69050: _execute() done 18823 1726855032.69052: dumping result to json 18823 1726855032.69054: done dumping result, returning 18823 1726855032.69057: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-d391-077c-00000000036e] 18823 1726855032.69059: sending task result for task 0affcc66-ac2b-d391-077c-00000000036e 18823 1726855032.69140: done sending task result for task 0affcc66-ac2b-d391-077c-00000000036e 18823 1726855032.69147: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18823 1726855032.69345: no more pending results, returning what we have 18823 1726855032.69349: results queue empty 18823 1726855032.69350: checking for any_errors_fatal 18823 1726855032.69356: done checking for any_errors_fatal 18823 1726855032.69357: checking for max_fail_percentage 18823 1726855032.69359: done checking for max_fail_percentage 18823 1726855032.69360: checking to see if all hosts have failed and the running result is not ok 18823 1726855032.69361: done checking to see if all hosts have failed 18823 1726855032.69362: getting the remaining hosts for this loop 18823 1726855032.69364: done getting the remaining hosts for this loop 18823 
1726855032.69368: getting the next task for host managed_node2 18823 1726855032.69376: done getting next task for host managed_node2 18823 1726855032.69380: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18823 1726855032.69383: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855032.69402: getting variables 18823 1726855032.69405: in VariableManager get_vars() 18823 1726855032.69449: Calling all_inventory to load vars for managed_node2 18823 1726855032.69452: Calling groups_inventory to load vars for managed_node2 18823 1726855032.69455: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.69472: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.69476: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.69479: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.71014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.72666: done with get_vars() 18823 1726855032.72695: done getting variables 18823 1726855032.72760: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:57:12 -0400 (0:00:00.063) 0:00:24.379 ****** 18823 1726855032.72803: entering _queue_task() for managed_node2/set_fact 18823 1726855032.73154: worker is 1 (out of 1 available) 18823 1726855032.73165: exiting _queue_task() for managed_node2/set_fact 18823 1726855032.73178: done queuing things up, now waiting for results queue to drain 18823 1726855032.73179: waiting for pending results... 18823 1726855032.73465: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18823 1726855032.73605: in run() - task 0affcc66-ac2b-d391-077c-00000000036f 18823 1726855032.73632: variable 'ansible_search_path' from source: unknown 18823 1726855032.73640: variable 'ansible_search_path' from source: unknown 18823 1726855032.73685: calling self._execute() 18823 1726855032.73789: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.73802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855032.73818: variable 'omit' from source: magic vars 18823 1726855032.74222: variable 'ansible_distribution_major_version' from source: facts 18823 1726855032.74241: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855032.74597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855032.74680: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855032.74733: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855032.74767: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 
1726855032.74854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855032.74943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855032.74980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855032.75067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855032.75178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855032.75320: variable '__network_is_ostree' from source: set_fact 18823 1726855032.75374: Evaluated conditional (not __network_is_ostree is defined): False 18823 1726855032.75382: when evaluation is False, skipping this task 18823 1726855032.75390: _execute() done 18823 1726855032.75583: dumping result to json 18823 1726855032.75586: done dumping result, returning 18823 1726855032.75591: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-d391-077c-00000000036f] 18823 1726855032.75593: sending task result for task 0affcc66-ac2b-d391-077c-00000000036f 18823 1726855032.75656: done sending task result for task 0affcc66-ac2b-d391-077c-00000000036f 18823 1726855032.75659: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18823 1726855032.75735: no more pending results, returning what we 
have 18823 1726855032.75739: results queue empty 18823 1726855032.75740: checking for any_errors_fatal 18823 1726855032.75745: done checking for any_errors_fatal 18823 1726855032.75746: checking for max_fail_percentage 18823 1726855032.75748: done checking for max_fail_percentage 18823 1726855032.75749: checking to see if all hosts have failed and the running result is not ok 18823 1726855032.75750: done checking to see if all hosts have failed 18823 1726855032.75750: getting the remaining hosts for this loop 18823 1726855032.75752: done getting the remaining hosts for this loop 18823 1726855032.75756: getting the next task for host managed_node2 18823 1726855032.75766: done getting next task for host managed_node2 18823 1726855032.75770: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18823 1726855032.75773: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855032.75786: getting variables 18823 1726855032.75790: in VariableManager get_vars() 18823 1726855032.75828: Calling all_inventory to load vars for managed_node2 18823 1726855032.75831: Calling groups_inventory to load vars for managed_node2 18823 1726855032.75833: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855032.75847: Calling all_plugins_play to load vars for managed_node2 18823 1726855032.75851: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855032.75855: Calling groups_plugins_play to load vars for managed_node2 18823 1726855032.79955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855032.82331: done with get_vars() 18823 1726855032.82362: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:57:12 -0400 (0:00:00.096) 0:00:24.476 ****** 18823 1726855032.82488: entering _queue_task() for managed_node2/service_facts 18823 1726855032.82869: worker is 1 (out of 1 available) 18823 1726855032.82882: exiting _queue_task() for managed_node2/service_facts 18823 1726855032.83096: done queuing things up, now waiting for results queue to drain 18823 1726855032.83097: waiting for pending results... 
18823 1726855032.83327: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 18823 1726855032.83494: in run() - task 0affcc66-ac2b-d391-077c-000000000371 18823 1726855032.83515: variable 'ansible_search_path' from source: unknown 18823 1726855032.83524: variable 'ansible_search_path' from source: unknown 18823 1726855032.83564: calling self._execute() 18823 1726855032.83679: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.83694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855032.83712: variable 'omit' from source: magic vars 18823 1726855032.84310: variable 'ansible_distribution_major_version' from source: facts 18823 1726855032.84350: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855032.84382: variable 'omit' from source: magic vars 18823 1726855032.84453: variable 'omit' from source: magic vars 18823 1726855032.84606: variable 'omit' from source: magic vars 18823 1726855032.84609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855032.84625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855032.84653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855032.84678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855032.84706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855032.84782: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855032.84799: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.84825: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 18823 1726855032.84939: Set connection var ansible_timeout to 10 18823 1726855032.84984: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855032.85011: Set connection var ansible_shell_type to sh 18823 1726855032.85024: Set connection var ansible_shell_executable to /bin/sh 18823 1726855032.85035: Set connection var ansible_connection to ssh 18823 1726855032.85044: Set connection var ansible_pipelining to False 18823 1726855032.85074: variable 'ansible_shell_executable' from source: unknown 18823 1726855032.85081: variable 'ansible_connection' from source: unknown 18823 1726855032.85089: variable 'ansible_module_compression' from source: unknown 18823 1726855032.85096: variable 'ansible_shell_type' from source: unknown 18823 1726855032.85102: variable 'ansible_shell_executable' from source: unknown 18823 1726855032.85139: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855032.85142: variable 'ansible_pipelining' from source: unknown 18823 1726855032.85145: variable 'ansible_timeout' from source: unknown 18823 1726855032.85147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855032.85338: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855032.85360: variable 'omit' from source: magic vars 18823 1726855032.85633: starting attempt loop 18823 1726855032.85638: running the handler 18823 1726855032.85641: _low_level_execute_command(): starting 18823 1726855032.85643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855032.86913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855032.86943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855032.87056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855032.87381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855032.87517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855032.89235: stdout chunk (state=3): >>>/root <<< 18823 1726855032.89373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855032.89386: stdout chunk (state=3): >>><<< 18823 1726855032.89402: stderr chunk (state=3): >>><<< 18823 1726855032.89592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855032.89596: _low_level_execute_command(): starting 18823 1726855032.89598: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806 `" && echo ansible-tmp-1726855032.8949225-19988-50300601162806="` echo /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806 `" ) && sleep 0' 18823 1726855032.90783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855032.90908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855032.91116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855032.91226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855032.91899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855032.94197: stdout chunk (state=3): >>>ansible-tmp-1726855032.8949225-19988-50300601162806=/root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806 <<< 18823 1726855032.94206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855032.94209: stdout chunk (state=3): >>><<< 18823 1726855032.94212: stderr chunk (state=3): >>><<< 18823 1726855032.94214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855032.8949225-19988-50300601162806=/root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855032.94217: variable 'ansible_module_compression' from source: unknown 18823 1726855032.94408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18823 1726855032.94595: variable 'ansible_facts' from source: unknown 18823 1726855032.94928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py 18823 1726855032.95622: Sending initial data 18823 1726855032.95625: Sent initial data (161 bytes) 18823 1726855032.96950: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855032.96965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855032.97346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855032.97350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855032.97352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855032.97355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855032.97609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855032.99167: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855032.99234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855032.99309: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpz8mwvx0p /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py <<< 18823 1726855032.99326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py" <<< 18823 1726855032.99383: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpz8mwvx0p" to remote "/root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py" <<< 18823 1726855033.02094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855033.02099: stdout chunk (state=3): >>><<< 18823 1726855033.02102: stderr chunk (state=3): >>><<< 18823 1726855033.02104: done transferring module to remote 18823 1726855033.02106: _low_level_execute_command(): starting 18823 1726855033.02108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/ /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py && sleep 0' 18823 1726855033.03717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855033.03792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855033.03874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855033.03955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855033.04013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855033.04125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855033.05947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855033.05961: stdout chunk (state=3): >>><<< 18823 1726855033.05964: stderr chunk (state=3): >>><<< 18823 1726855033.06094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855033.06099: _low_level_execute_command(): starting 18823 1726855033.06102: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/AnsiballZ_service_facts.py && sleep 0' 18823 1726855033.07210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855033.07394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855033.07524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855033.07693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 
1726855034.58555: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": 
{"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 18823 1726855034.58593: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 18823 1726855034.58653: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", 
"status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18823 1726855034.60375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855034.60384: stdout chunk (state=3): >>><<< 18823 1726855034.60593: stderr chunk (state=3): >>><<< 18823 1726855034.60599: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855034.62015: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855034.62032: _low_level_execute_command(): starting 18823 1726855034.62191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855032.8949225-19988-50300601162806/ > /dev/null 2>&1 && sleep 0' 18823 1726855034.62938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.63003: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855034.63036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855034.64903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855034.65021: stderr chunk (state=3): >>><<< 18823 1726855034.65024: stdout chunk (state=3): >>><<< 18823 1726855034.65040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855034.65106: handler run complete 18823 1726855034.65508: variable 'ansible_facts' from source: unknown 18823 1726855034.65846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
18823 1726855034.67036: variable 'ansible_facts' from source: unknown 18823 1726855034.67247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855034.67571: attempt loop complete, returning result 18823 1726855034.67896: _execute() done 18823 1726855034.67900: dumping result to json 18823 1726855034.67902: done dumping result, returning 18823 1726855034.67905: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-d391-077c-000000000371] 18823 1726855034.67907: sending task result for task 0affcc66-ac2b-d391-077c-000000000371 18823 1726855034.70550: done sending task result for task 0affcc66-ac2b-d391-077c-000000000371 18823 1726855034.70554: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855034.70676: no more pending results, returning what we have 18823 1726855034.70679: results queue empty 18823 1726855034.70680: checking for any_errors_fatal 18823 1726855034.70684: done checking for any_errors_fatal 18823 1726855034.70684: checking for max_fail_percentage 18823 1726855034.70686: done checking for max_fail_percentage 18823 1726855034.70688: checking to see if all hosts have failed and the running result is not ok 18823 1726855034.70689: done checking to see if all hosts have failed 18823 1726855034.70690: getting the remaining hosts for this loop 18823 1726855034.70692: done getting the remaining hosts for this loop 18823 1726855034.70696: getting the next task for host managed_node2 18823 1726855034.70702: done getting next task for host managed_node2 18823 1726855034.70705: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18823 1726855034.70710: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855034.70721: getting variables 18823 1726855034.70722: in VariableManager get_vars() 18823 1726855034.70754: Calling all_inventory to load vars for managed_node2 18823 1726855034.70757: Calling groups_inventory to load vars for managed_node2 18823 1726855034.70759: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855034.70768: Calling all_plugins_play to load vars for managed_node2 18823 1726855034.70771: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855034.70774: Calling groups_plugins_play to load vars for managed_node2 18823 1726855034.73828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855034.76848: done with get_vars() 18823 1726855034.76889: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:57:14 -0400 (0:00:01.945) 0:00:26.421 ****** 18823 1726855034.77005: entering _queue_task() for managed_node2/package_facts 18823 1726855034.77373: worker is 1 (out of 1 available) 18823 1726855034.77534: exiting _queue_task() for managed_node2/package_facts 18823 1726855034.77546: done queuing things up, now waiting for results queue to drain 18823 1726855034.77547: waiting for pending results... 
18823 1726855034.77705: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 18823 1726855034.77870: in run() - task 0affcc66-ac2b-d391-077c-000000000372 18823 1726855034.77898: variable 'ansible_search_path' from source: unknown 18823 1726855034.77906: variable 'ansible_search_path' from source: unknown 18823 1726855034.77955: calling self._execute() 18823 1726855034.78082: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855034.78086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855034.78099: variable 'omit' from source: magic vars 18823 1726855034.78517: variable 'ansible_distribution_major_version' from source: facts 18823 1726855034.78583: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855034.78589: variable 'omit' from source: magic vars 18823 1726855034.78620: variable 'omit' from source: magic vars 18823 1726855034.78666: variable 'omit' from source: magic vars 18823 1726855034.78719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855034.78769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855034.78812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855034.78825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855034.78854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855034.78949: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855034.78952: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855034.78955: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 18823 1726855034.79027: Set connection var ansible_timeout to 10 18823 1726855034.79041: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855034.79049: Set connection var ansible_shell_type to sh 18823 1726855034.79072: Set connection var ansible_shell_executable to /bin/sh 18823 1726855034.79166: Set connection var ansible_connection to ssh 18823 1726855034.79170: Set connection var ansible_pipelining to False 18823 1726855034.79174: variable 'ansible_shell_executable' from source: unknown 18823 1726855034.79176: variable 'ansible_connection' from source: unknown 18823 1726855034.79179: variable 'ansible_module_compression' from source: unknown 18823 1726855034.79181: variable 'ansible_shell_type' from source: unknown 18823 1726855034.79183: variable 'ansible_shell_executable' from source: unknown 18823 1726855034.79185: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855034.79186: variable 'ansible_pipelining' from source: unknown 18823 1726855034.79190: variable 'ansible_timeout' from source: unknown 18823 1726855034.79192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855034.79383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855034.79390: variable 'omit' from source: magic vars 18823 1726855034.79392: starting attempt loop 18823 1726855034.79395: running the handler 18823 1726855034.79405: _low_level_execute_command(): starting 18823 1726855034.79494: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855034.80171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855034.80192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 18823 1726855034.80266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.80320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855034.80336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855034.80489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855034.80584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855034.82696: stdout chunk (state=3): >>>/root <<< 18823 1726855034.82700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855034.82703: stdout chunk (state=3): >>><<< 18823 1726855034.82705: stderr chunk (state=3): >>><<< 18823 1726855034.82707: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855034.82709: _low_level_execute_command(): starting 18823 1726855034.82712: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006 `" && echo ansible-tmp-1726855034.8248615-20088-47883142984006="` echo /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006 `" ) && sleep 0' 18823 1726855034.83667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855034.83706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855034.83720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855034.83734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855034.83747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855034.83759: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855034.83762: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.83777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855034.83785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855034.83979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855034.83984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855034.83986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855034.83990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855034.83992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855034.83994: stderr chunk (state=3): >>>debug2: match found <<< 18823 1726855034.83996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.83998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855034.84041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855034.85956: stdout chunk (state=3): >>>ansible-tmp-1726855034.8248615-20088-47883142984006=/root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006 <<< 18823 1726855034.86170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855034.86174: stdout chunk (state=3): >>><<< 18823 1726855034.86177: stderr chunk (state=3): >>><<< 18823 1726855034.86274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855034.8248615-20088-47883142984006=/root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855034.86326: variable 'ansible_module_compression' from source: unknown 18823 1726855034.86414: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18823 1726855034.86610: variable 'ansible_facts' from source: unknown 18823 1726855034.87020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py 18823 1726855034.87334: Sending initial data 18823 1726855034.87338: Sent initial data (161 bytes) 18823 1726855034.88604: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855034.88651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.89043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855034.89047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855034.89049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855034.89051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855034.90694: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855034.90698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855034.90927: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpdx8hfrfh /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py <<< 18823 1726855034.90937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py" <<< 18823 1726855034.90999: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpdx8hfrfh" to remote "/root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py" <<< 18823 1726855034.94584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855034.94591: stderr chunk (state=3): >>><<< 18823 1726855034.94594: stdout chunk (state=3): >>><<< 18823 1726855034.94673: done transferring module to remote 18823 1726855034.94676: _low_level_execute_command(): starting 18823 1726855034.94679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/ /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py && sleep 0' 18823 1726855034.96109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855034.96175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 
18823 1726855034.96184: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855034.96195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.96210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855034.96217: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855034.96229: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855034.96238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855034.96248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855034.96261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855034.96336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.96448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855034.96451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855034.96454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855034.96512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855034.98332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855034.98336: stdout chunk (state=3): >>><<< 18823 1726855034.98345: stderr chunk (state=3): >>><<< 18823 1726855034.98374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855034.98378: _low_level_execute_command(): starting 18823 1726855034.98382: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/AnsiballZ_package_facts.py && sleep 0' 18823 1726855034.99008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855034.99014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855034.99026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855034.99073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855034.99086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855034.99095: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855034.99110: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855034.99163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855034.99212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855034.99215: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855034.99317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855034.99321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855034.99324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855034.99428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855035.43623: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": 
"google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": 
"glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", 
"version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version":
"1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", 
"version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": 
"dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", 
"release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": 
"1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", 
"version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": 
[{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": 
"x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", 
"version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18823 1726855035.45656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 
closed. <<< 18823 1726855035.45661: stdout chunk (state=3): >>><<< 18823 1726855035.45663: stderr chunk (state=3): >>><<< 18823 1726855035.45897: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": 
"10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": 
"libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": 
"2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", 
"version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", 
"version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": 
"crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": 
[{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": 
[{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": 
[{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855035.49999: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855035.50030: _low_level_execute_command(): starting 18823 1726855035.50040: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855034.8248615-20088-47883142984006/ > /dev/null 2>&1 && sleep 0' 18823 1726855035.50661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855035.50676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855035.50696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855035.50717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855035.50732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855035.50742: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855035.50753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855035.50841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855035.50867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855035.50972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855035.52945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855035.52955: stdout chunk (state=3): >>><<< 18823 1726855035.52965: stderr chunk (state=3): >>><<< 18823 1726855035.52982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855035.53097: handler run complete 18823 1726855035.53981: variable 'ansible_facts' from source: unknown 18823 1726855035.54470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855035.56856: variable 'ansible_facts' from source: unknown 18823 1726855035.57349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855035.58308: attempt loop complete, returning result 18823 1726855035.58329: _execute() done 18823 1726855035.58344: dumping result to json 18823 1726855035.58571: done dumping result, returning 18823 1726855035.58593: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-d391-077c-000000000372] 18823 1726855035.58602: sending task result for task 0affcc66-ac2b-d391-077c-000000000372 18823 1726855035.64142: done sending task result for task 0affcc66-ac2b-d391-077c-000000000372 18823 1726855035.64146: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855035.64251: no more pending results, returning what we have 18823 1726855035.64254: results queue empty 18823 1726855035.64255: checking for any_errors_fatal 18823 1726855035.64260: done checking for any_errors_fatal 18823 1726855035.64261: checking for max_fail_percentage 18823 1726855035.64262: done checking for max_fail_percentage 18823 1726855035.64263: checking to see if all hosts have failed and the running result is not ok 18823 1726855035.64264: done checking to see if all hosts have failed 18823 1726855035.64264: getting the remaining hosts for this loop 18823 1726855035.64265: done getting the remaining 
hosts for this loop 18823 1726855035.64269: getting the next task for host managed_node2 18823 1726855035.64274: done getting next task for host managed_node2 18823 1726855035.64277: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18823 1726855035.64279: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855035.64290: getting variables 18823 1726855035.64292: in VariableManager get_vars() 18823 1726855035.64322: Calling all_inventory to load vars for managed_node2 18823 1726855035.64324: Calling groups_inventory to load vars for managed_node2 18823 1726855035.64326: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855035.64335: Calling all_plugins_play to load vars for managed_node2 18823 1726855035.64337: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855035.64339: Calling groups_plugins_play to load vars for managed_node2 18823 1726855035.67506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855035.70520: done with get_vars() 18823 1726855035.70565: done getting variables 18823 1726855035.70642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:57:15 -0400 (0:00:00.936) 0:00:27.358 ****** 18823 
1726855035.70675: entering _queue_task() for managed_node2/debug 18823 1726855035.71166: worker is 1 (out of 1 available) 18823 1726855035.71179: exiting _queue_task() for managed_node2/debug 18823 1726855035.71192: done queuing things up, now waiting for results queue to drain 18823 1726855035.71193: waiting for pending results... 18823 1726855035.71510: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 18823 1726855035.71539: in run() - task 0affcc66-ac2b-d391-077c-00000000003d 18823 1726855035.71560: variable 'ansible_search_path' from source: unknown 18823 1726855035.71569: variable 'ansible_search_path' from source: unknown 18823 1726855035.71623: calling self._execute() 18823 1726855035.71748: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855035.71821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855035.71824: variable 'omit' from source: magic vars 18823 1726855035.72211: variable 'ansible_distribution_major_version' from source: facts 18823 1726855035.72243: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855035.72264: variable 'omit' from source: magic vars 18823 1726855035.72315: variable 'omit' from source: magic vars 18823 1726855035.72428: variable 'network_provider' from source: set_fact 18823 1726855035.72452: variable 'omit' from source: magic vars 18823 1726855035.72509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855035.72552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855035.72584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855035.72695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855035.72698: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855035.72700: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855035.72702: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855035.72707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855035.72778: Set connection var ansible_timeout to 10 18823 1726855035.72789: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855035.72796: Set connection var ansible_shell_type to sh 18823 1726855035.72810: Set connection var ansible_shell_executable to /bin/sh 18823 1726855035.72817: Set connection var ansible_connection to ssh 18823 1726855035.72827: Set connection var ansible_pipelining to False 18823 1726855035.72853: variable 'ansible_shell_executable' from source: unknown 18823 1726855035.72912: variable 'ansible_connection' from source: unknown 18823 1726855035.72915: variable 'ansible_module_compression' from source: unknown 18823 1726855035.72918: variable 'ansible_shell_type' from source: unknown 18823 1726855035.72920: variable 'ansible_shell_executable' from source: unknown 18823 1726855035.72922: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855035.72924: variable 'ansible_pipelining' from source: unknown 18823 1726855035.72926: variable 'ansible_timeout' from source: unknown 18823 1726855035.72931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855035.73057: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855035.73073: variable 'omit' from source: magic vars 18823 
1726855035.73082: starting attempt loop 18823 1726855035.73090: running the handler 18823 1726855035.73151: handler run complete 18823 1726855035.73239: attempt loop complete, returning result 18823 1726855035.73242: _execute() done 18823 1726855035.73245: dumping result to json 18823 1726855035.73247: done dumping result, returning 18823 1726855035.73249: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-d391-077c-00000000003d] 18823 1726855035.73252: sending task result for task 0affcc66-ac2b-d391-077c-00000000003d 18823 1726855035.73324: done sending task result for task 0affcc66-ac2b-d391-077c-00000000003d 18823 1726855035.73328: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 18823 1726855035.73392: no more pending results, returning what we have 18823 1726855035.73395: results queue empty 18823 1726855035.73396: checking for any_errors_fatal 18823 1726855035.73408: done checking for any_errors_fatal 18823 1726855035.73409: checking for max_fail_percentage 18823 1726855035.73411: done checking for max_fail_percentage 18823 1726855035.73412: checking to see if all hosts have failed and the running result is not ok 18823 1726855035.73412: done checking to see if all hosts have failed 18823 1726855035.73413: getting the remaining hosts for this loop 18823 1726855035.73414: done getting the remaining hosts for this loop 18823 1726855035.73418: getting the next task for host managed_node2 18823 1726855035.73425: done getting next task for host managed_node2 18823 1726855035.73428: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18823 1726855035.73430: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
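
[editorial note] The trace above ("Loading ActionModule 'debug' ... Using network provider: nm") corresponds to the `Print network provider` task at `roles/network/tasks/main.yml:7` in `fedora.linux_system_roles.network`. A minimal sketch of what such a task looks like — a hypothetical reconstruction for orientation, not the role's verbatim source:

```yaml
# Hypothetical sketch of the task traced above (main.yml:7).
# The `debug` action runs locally on the controller, which is why the
# log shows "handler run complete" with no module upload or SSH exec.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```

Because `debug` is an action plugin rather than a remote module, the executor skips `_execute_module`/`_low_level_execute_command` entirely, matching the short "running the handler / handler run complete" sequence in the log.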
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855035.73440: getting variables 18823 1726855035.73442: in VariableManager get_vars() 18823 1726855035.73479: Calling all_inventory to load vars for managed_node2 18823 1726855035.73481: Calling groups_inventory to load vars for managed_node2 18823 1726855035.73483: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855035.73615: Calling all_plugins_play to load vars for managed_node2 18823 1726855035.73619: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855035.73623: Calling groups_plugins_play to load vars for managed_node2 18823 1726855035.76341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855035.78214: done with get_vars() 18823 1726855035.78243: done getting variables 18823 1726855035.78310: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:57:15 -0400 (0:00:00.076) 0:00:27.435 ****** 18823 1726855035.78340: entering _queue_task() for managed_node2/fail 18823 1726855035.78941: worker is 1 (out of 1 available) 18823 1726855035.78953: exiting _queue_task() for managed_node2/fail 18823 1726855035.78965: done queuing things up, now waiting for results queue to drain 18823 1726855035.78966: waiting for pending results... 
18823 1726855035.79688: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
18823 1726855035.79698: in run() - task 0affcc66-ac2b-d391-077c-00000000003e
18823 1726855035.79702: variable 'ansible_search_path' from source: unknown
18823 1726855035.79704: variable 'ansible_search_path' from source: unknown
18823 1726855035.79826: calling self._execute()
18823 1726855035.79921: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855035.79933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855035.79959: variable 'omit' from source: magic vars
18823 1726855035.80361: variable 'ansible_distribution_major_version' from source: facts
18823 1726855035.80379: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855035.80507: variable 'network_state' from source: role '' defaults
18823 1726855035.80523: Evaluated conditional (network_state != {}): False
18823 1726855035.80531: when evaluation is False, skipping this task
18823 1726855035.80539: _execute() done
18823 1726855035.80556: dumping result to json
18823 1726855035.80564: done dumping result, returning
18823 1726855035.80657: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-d391-077c-00000000003e]
18823 1726855035.80661: sending task result for task 0affcc66-ac2b-d391-077c-00000000003e
18823 1726855035.80731: done sending task result for task 0affcc66-ac2b-d391-077c-00000000003e
18823 1726855035.80734: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18823 1726855035.80807: no more pending results, returning what we have
18823 1726855035.80812: results queue empty
18823 1726855035.80812: checking for any_errors_fatal
18823 1726855035.80817: done checking for any_errors_fatal
18823 1726855035.80818: checking for max_fail_percentage
18823 1726855035.80820: done checking for max_fail_percentage
18823 1726855035.80821: checking to see if all hosts have failed and the running result is not ok
18823 1726855035.80821: done checking to see if all hosts have failed
18823 1726855035.80822: getting the remaining hosts for this loop
18823 1726855035.80824: done getting the remaining hosts for this loop
18823 1726855035.80828: getting the next task for host managed_node2
18823 1726855035.80834: done getting next task for host managed_node2
18823 1726855035.80838: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18823 1726855035.80841: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855035.80855: getting variables
18823 1726855035.80856: in VariableManager get_vars()
18823 1726855035.80897: Calling all_inventory to load vars for managed_node2
18823 1726855035.80901: Calling groups_inventory to load vars for managed_node2
18823 1726855035.80903: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855035.80917: Calling all_plugins_play to load vars for managed_node2
18823 1726855035.80921: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855035.80925: Calling groups_plugins_play to load vars for managed_node2
18823 1726855035.83209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855035.84939: done with get_vars()
18823 1726855035.84971: done getting variables
18823 1726855035.85038: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:57:15 -0400 (0:00:00.067) 0:00:27.502 ******
18823 1726855035.85073: entering _queue_task() for managed_node2/fail
18823 1726855035.85616: worker is 1 (out of 1 available)
18823 1726855035.85626: exiting _queue_task() for managed_node2/fail
18823 1726855035.85635: done queuing things up, now waiting for results queue to drain
18823 1726855035.85636: waiting for pending results...
18823 1726855035.85764: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
18823 1726855035.85991: in run() - task 0affcc66-ac2b-d391-077c-00000000003f
18823 1726855035.85995: variable 'ansible_search_path' from source: unknown
18823 1726855035.85998: variable 'ansible_search_path' from source: unknown
18823 1726855035.86000: calling self._execute()
18823 1726855035.86170: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855035.86404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855035.86407: variable 'omit' from source: magic vars
18823 1726855035.86928: variable 'ansible_distribution_major_version' from source: facts
18823 1726855035.86949: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855035.87085: variable 'network_state' from source: role '' defaults
18823 1726855035.87103: Evaluated conditional (network_state != {}): False
18823 1726855035.87111: when evaluation is False, skipping this task
18823 1726855035.87119: _execute() done
18823 1726855035.87126: dumping result to json
18823 1726855035.87134: done dumping result, returning
18823 1726855035.87145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-d391-077c-00000000003f]
18823 1726855035.87155: sending task result for task 0affcc66-ac2b-d391-077c-00000000003f
18823 1726855035.87393: done sending task result for task 0affcc66-ac2b-d391-077c-00000000003f
18823 1726855035.87397: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18823 1726855035.87449: no more pending results, returning what we have
18823 1726855035.87453: results queue empty
18823 1726855035.87454: checking for any_errors_fatal
18823 1726855035.87461: done checking for any_errors_fatal
18823 1726855035.87462: checking for max_fail_percentage
18823 1726855035.87465: done checking for max_fail_percentage
18823 1726855035.87466: checking to see if all hosts have failed and the running result is not ok
18823 1726855035.87467: done checking to see if all hosts have failed
18823 1726855035.87468: getting the remaining hosts for this loop
18823 1726855035.87469: done getting the remaining hosts for this loop
18823 1726855035.87473: getting the next task for host managed_node2
18823 1726855035.87481: done getting next task for host managed_node2
18823 1726855035.87485: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18823 1726855035.87489: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855035.87507: getting variables
18823 1726855035.87509: in VariableManager get_vars()
18823 1726855035.87550: Calling all_inventory to load vars for managed_node2
18823 1726855035.87553: Calling groups_inventory to load vars for managed_node2
18823 1726855035.87556: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855035.87570: Calling all_plugins_play to load vars for managed_node2
18823 1726855035.87573: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855035.87576: Calling groups_plugins_play to load vars for managed_node2
18823 1726855035.90055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855035.92447: done with get_vars()
18823 1726855035.92471: done getting variables
18823 1726855035.92537: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:57:15 -0400 (0:00:00.074) 0:00:27.577 ******
18823 1726855035.92569: entering _queue_task() for managed_node2/fail
18823 1726855035.93317: worker is 1 (out of 1 available)
18823 1726855035.93330: exiting _queue_task() for managed_node2/fail
18823 1726855035.93344: done queuing things up, now waiting for results queue to drain
18823 1726855035.93345: waiting for pending results...
18823 1726855035.94046: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
18823 1726855035.94093: in run() - task 0affcc66-ac2b-d391-077c-000000000040
18823 1726855035.94114: variable 'ansible_search_path' from source: unknown
18823 1726855035.94122: variable 'ansible_search_path' from source: unknown
18823 1726855035.94166: calling self._execute()
18823 1726855035.94265: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855035.94277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855035.94294: variable 'omit' from source: magic vars
18823 1726855035.94681: variable 'ansible_distribution_major_version' from source: facts
18823 1726855035.94701: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855035.94929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18823 1726855035.97198: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18823 1726855035.97257: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18823 1726855035.97304: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18823 1726855035.97393: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18823 1726855035.97396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18823 1726855035.97457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855035.97507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855035.97536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855035.97582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855035.97601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855035.97700: variable 'ansible_distribution_major_version' from source: facts
18823 1726855035.97718: Evaluated conditional (ansible_distribution_major_version | int > 9): True
18823 1726855035.97835: variable 'ansible_distribution' from source: facts
18823 1726855035.97888: variable '__network_rh_distros' from source: role '' defaults
18823 1726855035.97893: Evaluated conditional (ansible_distribution in __network_rh_distros): True
18823 1726855035.98094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855035.98128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855035.98192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855035.98210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855035.98232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855035.98283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855035.98393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855035.98396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855035.98398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855035.98410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855035.98462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855035.98496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855035.98532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855035.98576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855035.98599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855035.98936: variable 'network_connections' from source: play vars
18823 1726855035.98962: variable 'profile' from source: play vars
18823 1726855035.99071: variable 'profile' from source: play vars
18823 1726855035.99075: variable 'interface' from source: set_fact
18823 1726855035.99113: variable 'interface' from source: set_fact
18823 1726855035.99129: variable 'network_state' from source: role '' defaults
18823 1726855035.99210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18823 1726855035.99394: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18823 1726855035.99592: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18823 1726855035.99595: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18823 1726855035.99597: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18823 1726855035.99599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18823 1726855035.99608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18823 1726855035.99610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855035.99612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18823 1726855035.99630: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
18823 1726855035.99635: when evaluation is False, skipping this task
18823 1726855035.99641: _execute() done
18823 1726855035.99647: dumping result to json
18823 1726855035.99652: done dumping result, returning
18823 1726855035.99662: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-d391-077c-000000000040]
18823 1726855035.99670: sending task result for task 0affcc66-ac2b-d391-077c-000000000040
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
18823 1726855035.99883: no more pending results, returning what we have
18823 1726855035.99886: results queue empty
18823 1726855035.99889: checking for any_errors_fatal
18823 1726855035.99897: done checking for any_errors_fatal
18823 1726855035.99897: checking for max_fail_percentage
18823 1726855035.99899: done checking for max_fail_percentage
18823 1726855035.99900: checking to see if all hosts have failed and the running result is not ok
18823 1726855035.99901: done checking to see if all hosts have failed
18823 1726855035.99901: getting the remaining hosts for this loop
18823 1726855035.99904: done getting the remaining hosts for this loop
18823 1726855035.99907: getting the next task for host managed_node2
18823 1726855035.99915: done getting next task for host managed_node2
18823 1726855035.99919: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18823 1726855035.99921: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855035.99935: getting variables
18823 1726855035.99937: in VariableManager get_vars()
18823 1726855036.00030: Calling all_inventory to load vars for managed_node2
18823 1726855036.00034: Calling groups_inventory to load vars for managed_node2
18823 1726855036.00036: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855036.00173: Calling all_plugins_play to load vars for managed_node2
18823 1726855036.00177: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855036.00180: Calling groups_plugins_play to load vars for managed_node2
18823 1726855036.00785: done sending task result for task 0affcc66-ac2b-d391-077c-000000000040
18823 1726855036.00792: WORKER PROCESS EXITING
18823 1726855036.01723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855036.03321: done with get_vars()
18823 1726855036.03345: done getting variables
18823 1726855036.03402: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:57:16 -0400 (0:00:00.108) 0:00:27.686 ******
18823 1726855036.03435: entering _queue_task() for managed_node2/dnf
18823 1726855036.03821: worker is 1 (out of 1 available)
18823 1726855036.03834: exiting _queue_task() for managed_node2/dnf
18823 1726855036.03845: done queuing things up, now waiting for results queue to drain
18823 1726855036.03846: waiting for pending results...
18823 1726855036.04104: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18823 1726855036.04232: in run() - task 0affcc66-ac2b-d391-077c-000000000041
18823 1726855036.04252: variable 'ansible_search_path' from source: unknown
18823 1726855036.04260: variable 'ansible_search_path' from source: unknown
18823 1726855036.04313: calling self._execute()
18823 1726855036.04414: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855036.04425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855036.04443: variable 'omit' from source: magic vars
18823 1726855036.04832: variable 'ansible_distribution_major_version' from source: facts
18823 1726855036.04952: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855036.05055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18823 1726855036.07342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18823 1726855036.07468: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18823 1726855036.07472: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18823 1726855036.07513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18823 1726855036.07544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18823 1726855036.07650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855036.07692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855036.07726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855036.07796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855036.07800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855036.07930: variable 'ansible_distribution' from source: facts
18823 1726855036.07940: variable 'ansible_distribution_major_version' from source: facts
18823 1726855036.07994: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
18823 1726855036.08097: variable '__network_wireless_connections_defined' from source: role '' defaults
18823 1726855036.08250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855036.08280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855036.08314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855036.08368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855036.08592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855036.08595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855036.08597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855036.08599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855036.08600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855036.08602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855036.08604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855036.08606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855036.08608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855036.08648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855036.08663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855036.08828: variable 'network_connections' from source: play vars
18823 1726855036.08847: variable 'profile' from source: play vars
18823 1726855036.08919: variable 'profile' from source: play vars
18823 1726855036.08930: variable 'interface' from source: set_fact
18823 1726855036.08997: variable 'interface' from source: set_fact
18823 1726855036.09070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18823 1726855036.09272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18823 1726855036.09316: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18823 1726855036.09352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18823 1726855036.09397: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18823 1726855036.09445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18823 1726855036.09472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18823 1726855036.09596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855036.09599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18823 1726855036.09602: variable '__network_team_connections_defined' from source: role '' defaults
18823 1726855036.10224: variable 'network_connections' from source: play vars
18823 1726855036.10236: variable 'profile' from source: play vars
18823 1726855036.10312: variable 'profile' from source: play vars
18823 1726855036.10322: variable 'interface' from source: set_fact
18823 1726855036.10394: variable 'interface' from source: set_fact
18823 1726855036.10427: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
18823 1726855036.10435: when evaluation is False, skipping this task
18823 1726855036.10442: _execute() done
18823 1726855036.10450: dumping result to json
18823 1726855036.10463: done dumping result, returning
18823 1726855036.10479: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000041]
18823 1726855036.10490: sending task result for task 0affcc66-ac2b-d391-077c-000000000041
18823 1726855036.10650: done sending task result for task 0affcc66-ac2b-d391-077c-000000000041
18823 1726855036.10653: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
18823 1726855036.10729: no more pending results, returning what we have
18823 1726855036.10733: results queue empty
18823 1726855036.10734: checking for any_errors_fatal
18823 1726855036.10740: done checking for any_errors_fatal
18823 1726855036.10741: checking for max_fail_percentage
18823 1726855036.10743: done checking for max_fail_percentage
18823 1726855036.10744: checking to see if all hosts have failed and the running result is not ok
18823 1726855036.10744: done checking to see if all hosts have failed
18823 1726855036.10745: getting the remaining hosts for this loop
18823 1726855036.10747: done getting the remaining hosts for this loop
18823 1726855036.10751: getting the next task for host managed_node2
18823 1726855036.10758: done getting next task for host managed_node2
18823 1726855036.10761: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18823 1726855036.10763: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855036.10776: getting variables
18823 1726855036.10778: in VariableManager get_vars()
18823 1726855036.10820: Calling all_inventory to load vars for managed_node2
18823 1726855036.10823: Calling groups_inventory to load vars for managed_node2
18823 1726855036.10825: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855036.10836: Calling all_plugins_play to load vars for managed_node2
18823 1726855036.10839: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855036.10841: Calling groups_plugins_play to load vars for managed_node2
18823 1726855036.13514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855036.15098: done with get_vars()
18823 1726855036.15130: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18823 1726855036.15199: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:57:16 -0400 (0:00:00.117) 0:00:27.803 ******
18823 1726855036.15232: entering _queue_task() for managed_node2/yum
18823 1726855036.15815: worker is 1 (out of 1 available)
18823 1726855036.15824: exiting _queue_task() for managed_node2/yum
18823 1726855036.15834: done queuing things up, now waiting for results queue to drain
18823 1726855036.15835: waiting for pending results...
18823 1726855036.15965: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18823 1726855036.16035: in run() - task 0affcc66-ac2b-d391-077c-000000000042
18823 1726855036.16171: variable 'ansible_search_path' from source: unknown
18823 1726855036.16175: variable 'ansible_search_path' from source: unknown
18823 1726855036.16178: calling self._execute()
18823 1726855036.16209: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855036.16220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855036.16236: variable 'omit' from source: magic vars
18823 1726855036.16617: variable 'ansible_distribution_major_version' from source: facts
18823 1726855036.16635: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855036.16819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18823 1726855036.20397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18823 1726855036.20493: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18823 1726855036.20580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18823 1726855036.20682: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18823 1726855036.20974: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18823 1726855036.20978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855036.21115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855036.21148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855036.21236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855036.21315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855036.21476: variable 'ansible_distribution_major_version' from source: facts
18823 1726855036.21568: Evaluated conditional (ansible_distribution_major_version | int < 8): False
18823 1726855036.21578: when evaluation is False, skipping this task
18823 1726855036.21590: _execute() done
18823 1726855036.21600: dumping result to json
18823 1726855036.21627: done dumping result, returning
18823 1726855036.21640: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000042]
18823 1726855036.21667: sending task result for task 0affcc66-ac2b-d391-077c-000000000042
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
18823 1726855036.21879: no more pending results, returning what we have
18823 1726855036.21883: results queue empty
18823 1726855036.21884: checking for any_errors_fatal
18823 1726855036.21894: done
checking for any_errors_fatal 18823 1726855036.21895: checking for max_fail_percentage 18823 1726855036.21898: done checking for max_fail_percentage 18823 1726855036.21899: checking to see if all hosts have failed and the running result is not ok 18823 1726855036.21899: done checking to see if all hosts have failed 18823 1726855036.21900: getting the remaining hosts for this loop 18823 1726855036.21902: done getting the remaining hosts for this loop 18823 1726855036.21906: getting the next task for host managed_node2 18823 1726855036.21914: done getting next task for host managed_node2 18823 1726855036.21918: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18823 1726855036.21920: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855036.21934: getting variables 18823 1726855036.21935: in VariableManager get_vars() 18823 1726855036.21975: Calling all_inventory to load vars for managed_node2 18823 1726855036.21978: Calling groups_inventory to load vars for managed_node2 18823 1726855036.21981: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855036.22329: Calling all_plugins_play to load vars for managed_node2 18823 1726855036.22334: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855036.22339: Calling groups_plugins_play to load vars for managed_node2 18823 1726855036.23036: done sending task result for task 0affcc66-ac2b-d391-077c-000000000042 18823 1726855036.23040: WORKER PROCESS EXITING 18823 1726855036.24263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855036.28869: done with get_vars() 18823 1726855036.28936: done getting variables 18823 1726855036.29131: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:57:16 -0400 (0:00:00.139) 0:00:27.943 ****** 18823 1726855036.29166: entering _queue_task() for managed_node2/fail 18823 1726855036.30212: worker is 1 (out of 1 available) 18823 1726855036.30223: exiting _queue_task() for managed_node2/fail 18823 1726855036.30233: done queuing things up, now waiting for results queue to drain 18823 1726855036.30234: waiting for pending results... 
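The YUM-check task above is skipped because its `when` clause, `ansible_distribution_major_version | int < 8`, evaluates to False on this node. A minimal Python sketch of that comparison (the function name and the literal fact value are illustrative, not ansible-core internals; note that Jinja's `int` filter also coerces unparsable strings to 0, which this sketch does not reproduce):

```python
def yum_check_needed(distro_major: str) -> bool:
    """Mirror the Jinja condition: ansible_distribution_major_version | int < 8."""
    # EL7 and older used classic yum; EL8+ redirect yum to dnf,
    # which is also why the log shows "redirecting ... yum to ... dnf".
    return int(distro_major) < 8

# On this managed node the fact is >= 8, so the conditional is False
# and the task is skipped, matching the log's skip_reason.
print(yum_check_needed("9"))
```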
18823 1726855036.30552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18823 1726855036.30723: in run() - task 0affcc66-ac2b-d391-077c-000000000043 18823 1726855036.30793: variable 'ansible_search_path' from source: unknown 18823 1726855036.30800: variable 'ansible_search_path' from source: unknown 18823 1726855036.30805: calling self._execute() 18823 1726855036.30920: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855036.30931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855036.31295: variable 'omit' from source: magic vars 18823 1726855036.31780: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.31857: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855036.32141: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855036.32341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855036.35055: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855036.35132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855036.35172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855036.35220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855036.35253: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855036.35354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18823 1726855036.35391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.35431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.35474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.35496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.35546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.35570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.35595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.35632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.35655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.35698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.35726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.35761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.35806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.35825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.36016: variable 'network_connections' from source: play vars 18823 1726855036.36034: variable 'profile' from source: play vars 18823 1726855036.36127: variable 'profile' from source: play vars 18823 1726855036.36135: variable 'interface' from source: set_fact 18823 1726855036.36201: variable 'interface' from source: set_fact 18823 1726855036.36272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855036.36493: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855036.36625: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855036.36628: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855036.36630: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855036.36684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855036.36714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855036.36747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.36773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855036.36823: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855036.37166: variable 'network_connections' from source: play vars 18823 1726855036.37181: variable 'profile' from source: play vars 18823 1726855036.37246: variable 'profile' from source: play vars 18823 1726855036.37263: variable 'interface' from source: set_fact 18823 1726855036.37370: variable 'interface' from source: set_fact 18823 1726855036.37374: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855036.37377: when evaluation is False, skipping this task 18823 1726855036.37386: _execute() done 18823 1726855036.37393: dumping result to json 18823 1726855036.37402: done dumping result, returning 18823 1726855036.37415: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000043] 18823 1726855036.37494: sending task result for task 0affcc66-ac2b-d391-077c-000000000043 18823 1726855036.37574: done sending task result for task 0affcc66-ac2b-d391-077c-000000000043 18823 1726855036.37577: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18823 1726855036.37661: no more pending results, returning what we have 18823 1726855036.37664: results queue empty 18823 1726855036.37665: checking for any_errors_fatal 18823 1726855036.37671: done checking for any_errors_fatal 18823 1726855036.37672: checking for max_fail_percentage 18823 1726855036.37674: done checking for max_fail_percentage 18823 1726855036.37675: checking to see if all hosts have failed and the running result is not ok 18823 1726855036.37676: done checking to see if all hosts have failed 18823 1726855036.37677: getting the remaining hosts for this loop 18823 1726855036.37678: done getting the remaining hosts for this loop 18823 1726855036.37682: getting the next task for host managed_node2 18823 1726855036.37695: done getting next task for host managed_node2 18823 1726855036.37699: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18823 1726855036.37702: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855036.37805: getting variables 18823 1726855036.37808: in VariableManager get_vars() 18823 1726855036.37850: Calling all_inventory to load vars for managed_node2 18823 1726855036.37853: Calling groups_inventory to load vars for managed_node2 18823 1726855036.37856: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855036.37870: Calling all_plugins_play to load vars for managed_node2 18823 1726855036.37874: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855036.37877: Calling groups_plugins_play to load vars for managed_node2 18823 1726855036.41310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855036.43441: done with get_vars() 18823 1726855036.43485: done getting variables 18823 1726855036.43555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:57:16 -0400 (0:00:00.144) 0:00:28.087 ****** 18823 1726855036.43611: entering _queue_task() for managed_node2/package 18823 1726855036.43984: worker is 1 (out of 1 available) 18823 1726855036.44208: exiting _queue_task() for managed_node2/package 18823 1726855036.44219: done queuing things up, now waiting for results queue to drain 18823 1726855036.44220: waiting for pending results... 
18823 1726855036.44458: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 18823 1726855036.44462: in run() - task 0affcc66-ac2b-d391-077c-000000000044 18823 1726855036.44479: variable 'ansible_search_path' from source: unknown 18823 1726855036.44488: variable 'ansible_search_path' from source: unknown 18823 1726855036.44531: calling self._execute() 18823 1726855036.44639: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855036.44652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855036.44673: variable 'omit' from source: magic vars 18823 1726855036.45097: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.45106: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855036.45300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855036.45645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855036.45648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855036.45655: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855036.45727: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855036.45845: variable 'network_packages' from source: role '' defaults 18823 1726855036.45949: variable '__network_provider_setup' from source: role '' defaults 18823 1726855036.45967: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855036.46082: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855036.46085: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855036.46125: variable 
'__network_packages_default_nm' from source: role '' defaults 18823 1726855036.46324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855036.48575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855036.48656: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855036.48793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855036.48798: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855036.48801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855036.48868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.48907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.48943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.48992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.49017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 
1726855036.49072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.49108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.49144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.49244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.49247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.49457: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18823 1726855036.49584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.49683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.49686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.49701: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.49725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.49826: variable 'ansible_python' from source: facts 18823 1726855036.49857: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18823 1726855036.49948: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855036.50041: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855036.50198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.50235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.50266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.50338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.50342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.50389: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.50448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.50466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.50558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.50561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.50682: variable 'network_connections' from source: play vars 18823 1726855036.50695: variable 'profile' from source: play vars 18823 1726855036.50798: variable 'profile' from source: play vars 18823 1726855036.50812: variable 'interface' from source: set_fact 18823 1726855036.50881: variable 'interface' from source: set_fact 18823 1726855036.50957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855036.50993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855036.51192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.51195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855036.51197: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855036.51370: variable 'network_connections' from source: play vars 18823 1726855036.51381: variable 'profile' from source: play vars 18823 1726855036.51496: variable 'profile' from source: play vars 18823 1726855036.51511: variable 'interface' from source: set_fact 18823 1726855036.51585: variable 'interface' from source: set_fact 18823 1726855036.51632: variable '__network_packages_default_wireless' from source: role '' defaults 18823 1726855036.51718: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855036.52075: variable 'network_connections' from source: play vars 18823 1726855036.52078: variable 'profile' from source: play vars 18823 1726855036.52132: variable 'profile' from source: play vars 18823 1726855036.52141: variable 'interface' from source: set_fact 18823 1726855036.52249: variable 'interface' from source: set_fact 18823 1726855036.52279: variable '__network_packages_default_team' from source: role '' defaults 18823 1726855036.52401: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855036.52676: variable 'network_connections' from source: play vars 18823 1726855036.52689: variable 'profile' from source: play vars 18823 1726855036.52763: variable 'profile' from source: play vars 18823 1726855036.52772: variable 'interface' from source: set_fact 18823 1726855036.52946: variable 'interface' from source: set_fact 18823 1726855036.52950: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855036.52995: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855036.53010: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855036.53074: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855036.53285: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18823 1726855036.53766: variable 'network_connections' from source: play vars 18823 1726855036.53777: variable 'profile' from source: play vars 18823 1726855036.53852: variable 'profile' from source: play vars 18823 1726855036.53862: variable 'interface' from source: set_fact 18823 1726855036.53938: variable 'interface' from source: set_fact 18823 1726855036.53952: variable 'ansible_distribution' from source: facts 18823 1726855036.53961: variable '__network_rh_distros' from source: role '' defaults 18823 1726855036.53971: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.53996: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18823 1726855036.54177: variable 'ansible_distribution' from source: facts 18823 1726855036.54192: variable '__network_rh_distros' from source: role '' defaults 18823 1726855036.54256: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.54260: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18823 1726855036.54396: variable 'ansible_distribution' from source: facts 18823 1726855036.54409: variable '__network_rh_distros' from source: role '' defaults 18823 1726855036.54420: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.54461: variable 'network_provider' from source: set_fact 18823 1726855036.54489: variable 'ansible_facts' from source: unknown 18823 1726855036.55282: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18823 
1726855036.55293: when evaluation is False, skipping this task 18823 1726855036.55300: _execute() done 18823 1726855036.55311: dumping result to json 18823 1726855036.55346: done dumping result, returning 18823 1726855036.55350: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-d391-077c-000000000044] 18823 1726855036.55352: sending task result for task 0affcc66-ac2b-d391-077c-000000000044 18823 1726855036.55695: done sending task result for task 0affcc66-ac2b-d391-077c-000000000044 18823 1726855036.55699: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18823 1726855036.55793: no more pending results, returning what we have 18823 1726855036.55796: results queue empty 18823 1726855036.55797: checking for any_errors_fatal 18823 1726855036.55805: done checking for any_errors_fatal 18823 1726855036.55806: checking for max_fail_percentage 18823 1726855036.55808: done checking for max_fail_percentage 18823 1726855036.55809: checking to see if all hosts have failed and the running result is not ok 18823 1726855036.55810: done checking to see if all hosts have failed 18823 1726855036.55810: getting the remaining hosts for this loop 18823 1726855036.55812: done getting the remaining hosts for this loop 18823 1726855036.55816: getting the next task for host managed_node2 18823 1726855036.55823: done getting next task for host managed_node2 18823 1726855036.55826: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18823 1726855036.55828: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18823 1726855036.55842: getting variables 18823 1726855036.55844: in VariableManager get_vars() 18823 1726855036.55882: Calling all_inventory to load vars for managed_node2 18823 1726855036.55885: Calling groups_inventory to load vars for managed_node2 18823 1726855036.55891: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855036.55910: Calling all_plugins_play to load vars for managed_node2 18823 1726855036.55914: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855036.55918: Calling groups_plugins_play to load vars for managed_node2 18823 1726855036.57648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855036.60357: done with get_vars() 18823 1726855036.60401: done getting variables 18823 1726855036.60466: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:57:16 -0400 (0:00:00.168) 0:00:28.256 ****** 18823 1726855036.60501: entering _queue_task() for managed_node2/package 18823 1726855036.60850: worker is 1 (out of 1 available) 18823 1726855036.60863: exiting _queue_task() for managed_node2/package 18823 1726855036.60876: done queuing things up, now waiting for results queue to drain 18823 1726855036.60877: waiting for pending results... 
18823 1726855036.61161: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18823 1726855036.61280: in run() - task 0affcc66-ac2b-d391-077c-000000000045 18823 1726855036.61303: variable 'ansible_search_path' from source: unknown 18823 1726855036.61317: variable 'ansible_search_path' from source: unknown 18823 1726855036.61360: calling self._execute() 18823 1726855036.61501: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855036.61513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855036.61530: variable 'omit' from source: magic vars 18823 1726855036.61917: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.61936: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855036.62056: variable 'network_state' from source: role '' defaults 18823 1726855036.62071: Evaluated conditional (network_state != {}): False 18823 1726855036.62078: when evaluation is False, skipping this task 18823 1726855036.62085: _execute() done 18823 1726855036.62096: dumping result to json 18823 1726855036.62104: done dumping result, returning 18823 1726855036.62116: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-d391-077c-000000000045] 18823 1726855036.62126: sending task result for task 0affcc66-ac2b-d391-077c-000000000045 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855036.62271: no more pending results, returning what we have 18823 1726855036.62275: results queue empty 18823 1726855036.62277: checking for any_errors_fatal 18823 1726855036.62284: done checking for any_errors_fatal 18823 1726855036.62285: checking for max_fail_percentage 18823 
1726855036.62390: done checking for max_fail_percentage 18823 1726855036.62392: checking to see if all hosts have failed and the running result is not ok 18823 1726855036.62393: done checking to see if all hosts have failed 18823 1726855036.62394: getting the remaining hosts for this loop 18823 1726855036.62396: done getting the remaining hosts for this loop 18823 1726855036.62400: getting the next task for host managed_node2 18823 1726855036.62407: done getting next task for host managed_node2 18823 1726855036.62413: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18823 1726855036.62415: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855036.62432: getting variables 18823 1726855036.62434: in VariableManager get_vars() 18823 1726855036.62472: Calling all_inventory to load vars for managed_node2 18823 1726855036.62475: Calling groups_inventory to load vars for managed_node2 18823 1726855036.62478: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855036.62693: Calling all_plugins_play to load vars for managed_node2 18823 1726855036.62698: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855036.62702: Calling groups_plugins_play to load vars for managed_node2 18823 1726855036.63402: done sending task result for task 0affcc66-ac2b-d391-077c-000000000045 18823 1726855036.63406: WORKER PROCESS EXITING 18823 1726855036.64132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855036.65981: done with get_vars() 18823 1726855036.66119: done getting variables 18823 1726855036.66181: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:57:16 -0400 (0:00:00.059) 0:00:28.315 ****** 18823 1726855036.66416: entering _queue_task() for managed_node2/package 18823 1726855036.66856: worker is 1 (out of 1 available) 18823 1726855036.66870: exiting _queue_task() for managed_node2/package 18823 1726855036.66881: done queuing things up, now waiting for results queue to drain 18823 1726855036.66882: waiting for pending results... 18823 1726855036.67207: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18823 1726855036.67301: in run() - task 0affcc66-ac2b-d391-077c-000000000046 18823 1726855036.67505: variable 'ansible_search_path' from source: unknown 18823 1726855036.67509: variable 'ansible_search_path' from source: unknown 18823 1726855036.67549: calling self._execute() 18823 1726855036.67665: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855036.67670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855036.67673: variable 'omit' from source: magic vars 18823 1726855036.68097: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.68103: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855036.68204: variable 'network_state' from source: role '' defaults 18823 1726855036.68222: Evaluated conditional (network_state != {}): False 18823 1726855036.68228: when evaluation is False, 
skipping this task 18823 1726855036.68234: _execute() done 18823 1726855036.68241: dumping result to json 18823 1726855036.68247: done dumping result, returning 18823 1726855036.68294: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-d391-077c-000000000046] 18823 1726855036.68297: sending task result for task 0affcc66-ac2b-d391-077c-000000000046 18823 1726855036.68444: done sending task result for task 0affcc66-ac2b-d391-077c-000000000046 18823 1726855036.68448: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855036.68544: no more pending results, returning what we have 18823 1726855036.68549: results queue empty 18823 1726855036.68550: checking for any_errors_fatal 18823 1726855036.68557: done checking for any_errors_fatal 18823 1726855036.68558: checking for max_fail_percentage 18823 1726855036.68560: done checking for max_fail_percentage 18823 1726855036.68561: checking to see if all hosts have failed and the running result is not ok 18823 1726855036.68562: done checking to see if all hosts have failed 18823 1726855036.68563: getting the remaining hosts for this loop 18823 1726855036.68565: done getting the remaining hosts for this loop 18823 1726855036.68570: getting the next task for host managed_node2 18823 1726855036.68578: done getting next task for host managed_node2 18823 1726855036.68582: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18823 1726855036.68585: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855036.68605: getting variables 18823 1726855036.68607: in VariableManager get_vars() 18823 1726855036.68650: Calling all_inventory to load vars for managed_node2 18823 1726855036.68653: Calling groups_inventory to load vars for managed_node2 18823 1726855036.68656: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855036.68670: Calling all_plugins_play to load vars for managed_node2 18823 1726855036.68673: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855036.68677: Calling groups_plugins_play to load vars for managed_node2 18823 1726855036.70342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855036.78332: done with get_vars() 18823 1726855036.78393: done getting variables 18823 1726855036.78486: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:57:16 -0400 (0:00:00.121) 0:00:28.437 ****** 18823 1726855036.78601: entering _queue_task() for managed_node2/service 18823 1726855036.79062: worker is 1 (out of 1 available) 18823 1726855036.79074: exiting _queue_task() for managed_node2/service 18823 1726855036.79202: done queuing things up, now waiting for results queue to drain 18823 1726855036.79205: waiting for pending results... 
18823 1726855036.79440: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18823 1726855036.79530: in run() - task 0affcc66-ac2b-d391-077c-000000000047 18823 1726855036.79539: variable 'ansible_search_path' from source: unknown 18823 1726855036.79541: variable 'ansible_search_path' from source: unknown 18823 1726855036.79638: calling self._execute() 18823 1726855036.79694: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855036.79709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855036.79725: variable 'omit' from source: magic vars 18823 1726855036.80254: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.80273: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855036.80416: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855036.80722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855036.82999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855036.83089: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855036.83145: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855036.83190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855036.83225: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855036.83320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18823 1726855036.83364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.83399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.83449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.83572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.83575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.83578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.83599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.83647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.83675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.83738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.83780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.83822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.83880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.83913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.84131: variable 'network_connections' from source: play vars 18823 1726855036.84192: variable 'profile' from source: play vars 18823 1726855036.84247: variable 'profile' from source: play vars 18823 1726855036.84257: variable 'interface' from source: set_fact 18823 1726855036.84329: variable 'interface' from source: set_fact 18823 1726855036.84420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855036.84626: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855036.84679: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855036.84785: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855036.84789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855036.84815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855036.84843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855036.84873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.84917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855036.84972: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855036.85337: variable 'network_connections' from source: play vars 18823 1726855036.85341: variable 'profile' from source: play vars 18823 1726855036.85343: variable 'profile' from source: play vars 18823 1726855036.85346: variable 'interface' from source: set_fact 18823 1726855036.85412: variable 'interface' from source: set_fact 18823 1726855036.85448: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855036.85460: when evaluation is False, skipping this task 18823 1726855036.85468: _execute() done 18823 1726855036.85476: dumping result to json 18823 1726855036.85484: done dumping result, returning 18823 1726855036.85501: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000047] 18823 1726855036.85553: sending task result for task 0affcc66-ac2b-d391-077c-000000000047 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18823 1726855036.85694: no more pending results, returning what we have 18823 1726855036.85697: results queue empty 18823 1726855036.85698: checking for any_errors_fatal 18823 1726855036.85707: done checking for any_errors_fatal 18823 1726855036.85708: checking for max_fail_percentage 18823 1726855036.85710: done checking for max_fail_percentage 18823 1726855036.85711: checking to see if all hosts have failed and the running result is not ok 18823 1726855036.85712: done checking to see if all hosts have failed 18823 1726855036.85712: getting the remaining hosts for this loop 18823 1726855036.85714: done getting the remaining hosts for this loop 18823 1726855036.85719: getting the next task for host managed_node2 18823 1726855036.85725: done getting next task for host managed_node2 18823 1726855036.85729: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18823 1726855036.85730: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855036.85743: getting variables 18823 1726855036.85745: in VariableManager get_vars() 18823 1726855036.85784: Calling all_inventory to load vars for managed_node2 18823 1726855036.85788: Calling groups_inventory to load vars for managed_node2 18823 1726855036.85791: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855036.85801: Calling all_plugins_play to load vars for managed_node2 18823 1726855036.85807: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855036.85809: Calling groups_plugins_play to load vars for managed_node2 18823 1726855036.86613: done sending task result for task 0affcc66-ac2b-d391-077c-000000000047 18823 1726855036.86617: WORKER PROCESS EXITING 18823 1726855036.87568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855036.89246: done with get_vars() 18823 1726855036.89279: done getting variables 18823 1726855036.89355: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:57:16 -0400 (0:00:00.107) 0:00:28.545 ****** 18823 1726855036.89390: entering _queue_task() for managed_node2/service 18823 1726855036.89833: worker is 1 (out of 1 available) 18823 1726855036.89847: exiting _queue_task() for managed_node2/service 18823 1726855036.89859: done queuing things up, now waiting for results queue to drain 18823 1726855036.89860: waiting for pending results... 
18823 1726855036.90124: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18823 1726855036.90260: in run() - task 0affcc66-ac2b-d391-077c-000000000048 18823 1726855036.90352: variable 'ansible_search_path' from source: unknown 18823 1726855036.90356: variable 'ansible_search_path' from source: unknown 18823 1726855036.90359: calling self._execute() 18823 1726855036.90441: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855036.90461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855036.90480: variable 'omit' from source: magic vars 18823 1726855036.90902: variable 'ansible_distribution_major_version' from source: facts 18823 1726855036.90925: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855036.91090: variable 'network_provider' from source: set_fact 18823 1726855036.91101: variable 'network_state' from source: role '' defaults 18823 1726855036.91123: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18823 1726855036.91192: variable 'omit' from source: magic vars 18823 1726855036.91195: variable 'omit' from source: magic vars 18823 1726855036.91197: variable 'network_service_name' from source: role '' defaults 18823 1726855036.91275: variable 'network_service_name' from source: role '' defaults 18823 1726855036.91390: variable '__network_provider_setup' from source: role '' defaults 18823 1726855036.91400: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855036.91469: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855036.91481: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855036.91551: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855036.91779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18823 1726855036.93974: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855036.94067: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855036.94154: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855036.94157: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855036.94196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855036.94299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.94339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.94385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.94592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.94596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.94599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18823 1726855036.94601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.94606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.94608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.94623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.94876: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18823 1726855036.95012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.95042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.95079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.95129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.95149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.95251: variable 'ansible_python' from source: facts 18823 1726855036.95289: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18823 1726855036.95384: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855036.95470: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855036.95708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.95712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.95715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.95717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.95739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.95790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855036.95837: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855036.95867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.95923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855036.95945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855036.96093: variable 'network_connections' from source: play vars 18823 1726855036.96110: variable 'profile' from source: play vars 18823 1726855036.96194: variable 'profile' from source: play vars 18823 1726855036.96210: variable 'interface' from source: set_fact 18823 1726855036.96277: variable 'interface' from source: set_fact 18823 1726855036.96384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855036.96992: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855036.96998: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855036.97023: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855036.97066: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855036.97147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855036.97183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855036.97235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855036.97272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855036.97334: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855036.97660: variable 'network_connections' from source: play vars 18823 1726855036.97664: variable 'profile' from source: play vars 18823 1726855036.97733: variable 'profile' from source: play vars 18823 1726855036.97745: variable 'interface' from source: set_fact 18823 1726855036.97878: variable 'interface' from source: set_fact 18823 1726855036.97881: variable '__network_packages_default_wireless' from source: role '' defaults 18823 1726855036.97942: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855036.98250: variable 'network_connections' from source: play vars 18823 1726855036.98259: variable 'profile' from source: play vars 18823 1726855036.98335: variable 'profile' from source: play vars 18823 1726855036.98345: variable 'interface' from source: set_fact 18823 1726855036.98426: variable 'interface' from source: set_fact 18823 1726855036.98453: variable '__network_packages_default_team' from source: role '' defaults 18823 1726855036.98537: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855036.98840: variable 
'network_connections' from source: play vars 18823 1726855036.98855: variable 'profile' from source: play vars 18823 1726855036.98964: variable 'profile' from source: play vars 18823 1726855036.98967: variable 'interface' from source: set_fact 18823 1726855036.99025: variable 'interface' from source: set_fact 18823 1726855036.99095: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855036.99159: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855036.99170: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855036.99242: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855036.99472: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18823 1726855036.99999: variable 'network_connections' from source: play vars 18823 1726855037.00014: variable 'profile' from source: play vars 18823 1726855037.00085: variable 'profile' from source: play vars 18823 1726855037.00099: variable 'interface' from source: set_fact 18823 1726855037.00180: variable 'interface' from source: set_fact 18823 1726855037.00392: variable 'ansible_distribution' from source: facts 18823 1726855037.00395: variable '__network_rh_distros' from source: role '' defaults 18823 1726855037.00397: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.00399: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18823 1726855037.00412: variable 'ansible_distribution' from source: facts 18823 1726855037.00422: variable '__network_rh_distros' from source: role '' defaults 18823 1726855037.00431: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.00449: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18823 1726855037.00639: variable 'ansible_distribution' from source: 
facts 18823 1726855037.00647: variable '__network_rh_distros' from source: role '' defaults 18823 1726855037.00657: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.00700: variable 'network_provider' from source: set_fact 18823 1726855037.00739: variable 'omit' from source: magic vars 18823 1726855037.00773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855037.00811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855037.00843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855037.00867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855037.00882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855037.00922: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855037.00931: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855037.00939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855037.01168: Set connection var ansible_timeout to 10 18823 1726855037.01172: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855037.01175: Set connection var ansible_shell_type to sh 18823 1726855037.01176: Set connection var ansible_shell_executable to /bin/sh 18823 1726855037.01178: Set connection var ansible_connection to ssh 18823 1726855037.01180: Set connection var ansible_pipelining to False 18823 1726855037.01182: variable 'ansible_shell_executable' from source: unknown 18823 1726855037.01183: variable 'ansible_connection' from source: unknown 18823 1726855037.01185: variable 'ansible_module_compression' from source: unknown 18823 1726855037.01189: 
variable 'ansible_shell_type' from source: unknown 18823 1726855037.01191: variable 'ansible_shell_executable' from source: unknown 18823 1726855037.01193: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855037.01200: variable 'ansible_pipelining' from source: unknown 18823 1726855037.01202: variable 'ansible_timeout' from source: unknown 18823 1726855037.01207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855037.01389: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855037.01393: variable 'omit' from source: magic vars 18823 1726855037.01395: starting attempt loop 18823 1726855037.01400: running the handler 18823 1726855037.01427: variable 'ansible_facts' from source: unknown 18823 1726855037.02695: _low_level_execute_command(): starting 18823 1726855037.02699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855037.03750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855037.03760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855037.03786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855037.03894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855037.05742: stdout chunk (state=3): >>>/root <<< 18823 1726855037.05746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855037.06148: stderr chunk (state=3): >>><<< 18823 1726855037.06151: stdout chunk (state=3): >>><<< 18823 1726855037.06154: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855037.06157: _low_level_execute_command(): starting 18823 1726855037.06160: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301 `" && echo ansible-tmp-1726855037.0613675-20210-216385268269301="` echo /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301 `" ) && sleep 0' 18823 1726855037.07196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855037.07200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855037.07220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855037.07295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855037.07298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855037.07301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855037.07306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855037.07437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 18823 1726855037.07476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855037.09412: stdout chunk (state=3): >>>ansible-tmp-1726855037.0613675-20210-216385268269301=/root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301 <<< 18823 1726855037.09598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855037.09636: stderr chunk (state=3): >>><<< 18823 1726855037.09640: stdout chunk (state=3): >>><<< 18823 1726855037.09693: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855037.0613675-20210-216385268269301=/root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855037.09696: variable 'ansible_module_compression' from source: 
unknown 18823 1726855037.09841: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18823 1726855037.10053: variable 'ansible_facts' from source: unknown 18823 1726855037.10592: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py 18823 1726855037.10826: Sending initial data 18823 1726855037.10828: Sent initial data (156 bytes) 18823 1726855037.11511: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855037.11522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855037.11539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855037.11548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855037.11695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855037.13245: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855037.13328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855037.13417: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpk47avq23 /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py <<< 18823 1726855037.13421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py" <<< 18823 1726855037.13522: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpk47avq23" to remote "/root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py" <<< 18823 1726855037.15841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855037.15846: stdout chunk (state=3): >>><<< 18823 1726855037.15852: stderr chunk (state=3): >>><<< 18823 1726855037.15931: done transferring module to remote 18823 1726855037.15993: _low_level_execute_command(): starting 18823 1726855037.15996: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/ /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py && sleep 0' 18823 1726855037.16738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855037.16745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855037.16757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855037.16771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855037.17044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855037.17048: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855037.17050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855037.17053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855037.18880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855037.18884: stdout chunk (state=3): >>><<< 18823 1726855037.18892: stderr chunk 
(state=3): >>><<< 18823 1726855037.18911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855037.18914: _low_level_execute_command(): starting 18823 1726855037.18919: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/AnsiballZ_systemd.py && sleep 0' 18823 1726855037.19577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855037.19581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855037.19583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855037.19586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855037.19591: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855037.19594: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855037.19596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855037.19598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855037.19600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855037.19607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855037.19609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855037.19616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855037.19627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855037.19635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855037.19642: stderr chunk (state=3): >>>debug2: match found <<< 18823 1726855037.19651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855037.19726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855037.19739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855037.19748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855037.19860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855037.48938: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": 
"100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4587520", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310260224", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1156221000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 18823 1726855037.48980: stdout chunk (state=3): >>>pReceive": "no", 
"UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": 
"no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target 
NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": 
"none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18823 1726855037.51607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855037.51611: stdout chunk (state=3): >>><<< 18823 1726855037.51614: stderr chunk (state=3): >>><<< 18823 1726855037.51617: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4587520", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310260224", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1156221000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", 
"StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", 
"InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855037.51833: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855037.51919: _low_level_execute_command(): starting 18823 1726855037.52001: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855037.0613675-20210-216385268269301/ > /dev/null 2>&1 && sleep 0' 18823 1726855037.53222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855037.53236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855037.53305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855037.53326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855037.53564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855037.53624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855037.55585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855037.55592: stdout chunk (state=3): >>><<< 18823 1726855037.55597: stderr chunk (state=3): >>><<< 18823 1726855037.55615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855037.55622: handler run complete 18823 1726855037.55684: attempt loop complete, returning result 18823 1726855037.55691: _execute() done 18823 1726855037.55693: dumping result to json 18823 1726855037.55820: done dumping result, returning 18823 1726855037.55829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-d391-077c-000000000048] 18823 1726855037.55833: sending task result for task 0affcc66-ac2b-d391-077c-000000000048 18823 1726855037.56440: done sending task result for task 0affcc66-ac2b-d391-077c-000000000048 18823 1726855037.56444: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855037.56503: no more pending results, returning what we have 18823 1726855037.56506: results queue empty 18823 1726855037.56508: checking for any_errors_fatal 18823 1726855037.56512: done checking for any_errors_fatal 18823 1726855037.56513: checking for max_fail_percentage 18823 1726855037.56515: done checking for max_fail_percentage 18823 1726855037.56515: checking to see if all hosts have failed and the running result is not ok 18823 1726855037.56516: done checking to see if all hosts have failed 18823 1726855037.56517: getting the remaining hosts for this loop 18823 1726855037.56518: done getting the remaining hosts for 
this loop 18823 1726855037.56522: getting the next task for host managed_node2 18823 1726855037.56528: done getting next task for host managed_node2 18823 1726855037.56531: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18823 1726855037.56533: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855037.56541: getting variables 18823 1726855037.56543: in VariableManager get_vars() 18823 1726855037.56593: Calling all_inventory to load vars for managed_node2 18823 1726855037.56596: Calling groups_inventory to load vars for managed_node2 18823 1726855037.56598: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855037.56608: Calling all_plugins_play to load vars for managed_node2 18823 1726855037.56611: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855037.56613: Calling groups_plugins_play to load vars for managed_node2 18823 1726855037.59497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855037.61915: done with get_vars() 18823 1726855037.61944: done getting variables 18823 1726855037.62021: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:57:17 -0400 (0:00:00.726) 0:00:29.272 ****** 18823 
1726855037.62051: entering _queue_task() for managed_node2/service 18823 1726855037.62481: worker is 1 (out of 1 available) 18823 1726855037.62499: exiting _queue_task() for managed_node2/service 18823 1726855037.62513: done queuing things up, now waiting for results queue to drain 18823 1726855037.62514: waiting for pending results... 18823 1726855037.63112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18823 1726855037.63398: in run() - task 0affcc66-ac2b-d391-077c-000000000049 18823 1726855037.63403: variable 'ansible_search_path' from source: unknown 18823 1726855037.63405: variable 'ansible_search_path' from source: unknown 18823 1726855037.63408: calling self._execute() 18823 1726855037.63615: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855037.63620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855037.63637: variable 'omit' from source: magic vars 18823 1726855037.64465: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.64476: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855037.64775: variable 'network_provider' from source: set_fact 18823 1726855037.64779: Evaluated conditional (network_provider == "nm"): True 18823 1726855037.64996: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855037.65083: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855037.65481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855037.68759: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855037.68830: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855037.68867: Loading FilterModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855037.68904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855037.68940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855037.69039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855037.69069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855037.69097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855037.69358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855037.69361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855037.69364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855037.69366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855037.69368: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855037.69370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855037.69373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855037.69375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855037.69380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855037.69406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855037.69446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855037.69464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855037.69621: variable 'network_connections' from source: play vars 18823 1726855037.69634: variable 'profile' from source: play vars 18823 1726855037.69716: variable 'profile' from source: play vars 18823 1726855037.69719: variable 
'interface' from source: set_fact 18823 1726855037.69778: variable 'interface' from source: set_fact 18823 1726855037.69854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855037.70194: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855037.70198: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855037.70200: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855037.70203: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855037.70205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855037.70207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855037.70209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855037.70236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855037.70282: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855037.70549: variable 'network_connections' from source: play vars 18823 1726855037.70555: variable 'profile' from source: play vars 18823 1726855037.70618: variable 'profile' from source: play vars 18823 1726855037.70623: variable 'interface' from source: set_fact 18823 1726855037.70684: 
variable 'interface' from source: set_fact 18823 1726855037.70718: Evaluated conditional (__network_wpa_supplicant_required): False 18823 1726855037.70722: when evaluation is False, skipping this task 18823 1726855037.70724: _execute() done 18823 1726855037.70741: dumping result to json 18823 1726855037.70743: done dumping result, returning 18823 1726855037.70745: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-d391-077c-000000000049] 18823 1726855037.70747: sending task result for task 0affcc66-ac2b-d391-077c-000000000049 18823 1726855037.70839: done sending task result for task 0affcc66-ac2b-d391-077c-000000000049 18823 1726855037.71132: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18823 1726855037.71172: no more pending results, returning what we have 18823 1726855037.71175: results queue empty 18823 1726855037.71176: checking for any_errors_fatal 18823 1726855037.71196: done checking for any_errors_fatal 18823 1726855037.71197: checking for max_fail_percentage 18823 1726855037.71199: done checking for max_fail_percentage 18823 1726855037.71200: checking to see if all hosts have failed and the running result is not ok 18823 1726855037.71201: done checking to see if all hosts have failed 18823 1726855037.71201: getting the remaining hosts for this loop 18823 1726855037.71205: done getting the remaining hosts for this loop 18823 1726855037.71209: getting the next task for host managed_node2 18823 1726855037.71215: done getting next task for host managed_node2 18823 1726855037.71219: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18823 1726855037.71221: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855037.71233: getting variables 18823 1726855037.71235: in VariableManager get_vars() 18823 1726855037.71270: Calling all_inventory to load vars for managed_node2 18823 1726855037.71273: Calling groups_inventory to load vars for managed_node2 18823 1726855037.71275: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855037.71284: Calling all_plugins_play to load vars for managed_node2 18823 1726855037.71289: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855037.71292: Calling groups_plugins_play to load vars for managed_node2 18823 1726855037.72620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855037.74656: done with get_vars() 18823 1726855037.74902: done getting variables 18823 1726855037.74967: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:57:17 -0400 (0:00:00.129) 0:00:29.401 ****** 18823 1726855037.75000: entering _queue_task() for managed_node2/service 18823 1726855037.75769: worker is 1 (out of 1 available) 18823 1726855037.75781: exiting _queue_task() for managed_node2/service 18823 1726855037.75795: done queuing things up, now waiting for results queue to drain 18823 1726855037.75796: waiting for pending results... 
18823 1726855037.76289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 18823 1726855037.76594: in run() - task 0affcc66-ac2b-d391-077c-00000000004a 18823 1726855037.76657: variable 'ansible_search_path' from source: unknown 18823 1726855037.76666: variable 'ansible_search_path' from source: unknown 18823 1726855037.76716: calling self._execute() 18823 1726855037.76878: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855037.77016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855037.77032: variable 'omit' from source: magic vars 18823 1726855037.77949: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.78184: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855037.78328: variable 'network_provider' from source: set_fact 18823 1726855037.78339: Evaluated conditional (network_provider == "initscripts"): False 18823 1726855037.78347: when evaluation is False, skipping this task 18823 1726855037.78354: _execute() done 18823 1726855037.78361: dumping result to json 18823 1726855037.78368: done dumping result, returning 18823 1726855037.78381: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-d391-077c-00000000004a] 18823 1726855037.78394: sending task result for task 0affcc66-ac2b-d391-077c-00000000004a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855037.78651: no more pending results, returning what we have 18823 1726855037.78655: results queue empty 18823 1726855037.78656: checking for any_errors_fatal 18823 1726855037.78664: done checking for any_errors_fatal 18823 1726855037.78665: checking for max_fail_percentage 18823 1726855037.78667: done checking for max_fail_percentage 18823 
1726855037.78668: checking to see if all hosts have failed and the running result is not ok 18823 1726855037.78669: done checking to see if all hosts have failed 18823 1726855037.78670: getting the remaining hosts for this loop 18823 1726855037.78672: done getting the remaining hosts for this loop 18823 1726855037.78676: getting the next task for host managed_node2 18823 1726855037.78685: done getting next task for host managed_node2 18823 1726855037.78692: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18823 1726855037.78696: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855037.78714: getting variables 18823 1726855037.78716: in VariableManager get_vars() 18823 1726855037.78760: Calling all_inventory to load vars for managed_node2 18823 1726855037.78763: Calling groups_inventory to load vars for managed_node2 18823 1726855037.78766: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855037.78779: Calling all_plugins_play to load vars for managed_node2 18823 1726855037.78783: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855037.78786: Calling groups_plugins_play to load vars for managed_node2 18823 1726855037.79701: done sending task result for task 0affcc66-ac2b-d391-077c-00000000004a 18823 1726855037.79708: WORKER PROCESS EXITING 18823 1726855037.80523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855037.82143: done with get_vars() 18823 1726855037.82172: done getting variables 18823 1726855037.82237: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:57:17 -0400 (0:00:00.072) 0:00:29.474 ****** 18823 1726855037.82268: entering _queue_task() for managed_node2/copy 18823 1726855037.82632: worker is 1 (out of 1 available) 18823 1726855037.82645: exiting _queue_task() for managed_node2/copy 18823 1726855037.82657: done queuing things up, now waiting for results queue to drain 18823 1726855037.82658: waiting for pending results... 18823 1726855037.82954: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18823 1726855037.83082: in run() - task 0affcc66-ac2b-d391-077c-00000000004b 18823 1726855037.83116: variable 'ansible_search_path' from source: unknown 18823 1726855037.83126: variable 'ansible_search_path' from source: unknown 18823 1726855037.83168: calling self._execute() 18823 1726855037.83277: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855037.83292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855037.83311: variable 'omit' from source: magic vars 18823 1726855037.83726: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.83745: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855037.83878: variable 'network_provider' from source: set_fact 18823 1726855037.83892: Evaluated conditional (network_provider == "initscripts"): False 18823 1726855037.83900: when evaluation is False, skipping this task 18823 1726855037.83910: _execute() done 18823 1726855037.83919: dumping result to json 
18823 1726855037.83927: done dumping result, returning 18823 1726855037.83941: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-d391-077c-00000000004b] 18823 1726855037.83951: sending task result for task 0affcc66-ac2b-d391-077c-00000000004b 18823 1726855037.84064: done sending task result for task 0affcc66-ac2b-d391-077c-00000000004b 18823 1726855037.84071: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18823 1726855037.84140: no more pending results, returning what we have 18823 1726855037.84144: results queue empty 18823 1726855037.84145: checking for any_errors_fatal 18823 1726855037.84151: done checking for any_errors_fatal 18823 1726855037.84153: checking for max_fail_percentage 18823 1726855037.84154: done checking for max_fail_percentage 18823 1726855037.84156: checking to see if all hosts have failed and the running result is not ok 18823 1726855037.84156: done checking to see if all hosts have failed 18823 1726855037.84157: getting the remaining hosts for this loop 18823 1726855037.84159: done getting the remaining hosts for this loop 18823 1726855037.84163: getting the next task for host managed_node2 18823 1726855037.84170: done getting next task for host managed_node2 18823 1726855037.84174: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18823 1726855037.84176: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855037.84193: getting variables 18823 1726855037.84195: in VariableManager get_vars() 18823 1726855037.84238: Calling all_inventory to load vars for managed_node2 18823 1726855037.84241: Calling groups_inventory to load vars for managed_node2 18823 1726855037.84244: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855037.84257: Calling all_plugins_play to load vars for managed_node2 18823 1726855037.84261: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855037.84264: Calling groups_plugins_play to load vars for managed_node2 18823 1726855037.86991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855037.90635: done with get_vars() 18823 1726855037.90666: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:57:17 -0400 (0:00:00.084) 0:00:29.559 ****** 18823 1726855037.90757: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18823 1726855037.91524: worker is 1 (out of 1 available) 18823 1726855037.91538: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18823 1726855037.91552: done queuing things up, now waiting for results queue to drain 18823 1726855037.91553: waiting for pending results... 
18823 1726855037.92073: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18823 1726855037.92209: in run() - task 0affcc66-ac2b-d391-077c-00000000004c 18823 1726855037.92213: variable 'ansible_search_path' from source: unknown 18823 1726855037.92215: variable 'ansible_search_path' from source: unknown 18823 1726855037.92337: calling self._execute() 18823 1726855037.92421: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855037.92425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855037.92450: variable 'omit' from source: magic vars 18823 1726855037.93332: variable 'ansible_distribution_major_version' from source: facts 18823 1726855037.93336: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855037.93338: variable 'omit' from source: magic vars 18823 1726855037.93340: variable 'omit' from source: magic vars 18823 1726855037.93658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855037.97976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855037.98037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855037.98071: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855037.98309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855037.98336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855037.98416: variable 'network_provider' from source: set_fact 18823 1726855037.98869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855037.98873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855037.98876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855037.98878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855037.98880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855037.99137: variable 'omit' from source: magic vars 18823 1726855037.99250: variable 'omit' from source: magic vars 18823 1726855037.99493: variable 'network_connections' from source: play vars 18823 1726855037.99508: variable 'profile' from source: play vars 18823 1726855037.99577: variable 'profile' from source: play vars 18823 1726855037.99581: variable 'interface' from source: set_fact 18823 1726855037.99646: variable 'interface' from source: set_fact 18823 1726855037.99994: variable 'omit' from source: magic vars 18823 1726855037.99997: variable '__lsr_ansible_managed' from source: task vars 18823 1726855037.99999: variable '__lsr_ansible_managed' from source: task vars 18823 1726855038.00165: Loaded config def from plugin (lookup/template) 18823 1726855038.00168: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18823 1726855038.00219: File lookup term: get_ansible_managed.j2 18823 
1726855038.00222: variable 'ansible_search_path' from source: unknown 18823 1726855038.00225: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18823 1726855038.00240: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18823 1726855038.00264: variable 'ansible_search_path' from source: unknown 18823 1726855038.10555: variable 'ansible_managed' from source: unknown 18823 1726855038.10698: variable 'omit' from source: magic vars 18823 1726855038.10730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855038.10760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855038.10778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855038.10799: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855038.10810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855038.10839: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855038.10843: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.10846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.10947: Set connection var ansible_timeout to 10 18823 1726855038.10950: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855038.10953: Set connection var ansible_shell_type to sh 18823 1726855038.10965: Set connection var ansible_shell_executable to /bin/sh 18823 1726855038.10970: Set connection var ansible_connection to ssh 18823 1726855038.10976: Set connection var ansible_pipelining to False 18823 1726855038.11007: variable 'ansible_shell_executable' from source: unknown 18823 1726855038.11011: variable 'ansible_connection' from source: unknown 18823 1726855038.11013: variable 'ansible_module_compression' from source: unknown 18823 1726855038.11016: variable 'ansible_shell_type' from source: unknown 18823 1726855038.11018: variable 'ansible_shell_executable' from source: unknown 18823 1726855038.11020: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.11022: variable 'ansible_pipelining' from source: unknown 18823 1726855038.11024: variable 'ansible_timeout' from source: unknown 18823 1726855038.11026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.11273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855038.11284: variable 'omit' from source: magic vars 18823 1726855038.11289: starting attempt loop 18823 1726855038.11291: running the handler 18823 1726855038.11294: _low_level_execute_command(): starting 18823 1726855038.11296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855038.12377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855038.12627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855038.12693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855038.14533: stdout chunk (state=3): >>>/root <<< 18823 1726855038.14815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855038.14819: stdout chunk (state=3): >>><<< 18823 1726855038.14821: stderr chunk 
(state=3): >>><<< 18823 1726855038.14824: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855038.14827: _low_level_execute_command(): starting 18823 1726855038.14829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192 `" && echo ansible-tmp-1726855038.1461856-20258-49879263475192="` echo /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192 `" ) && sleep 0' 18823 1726855038.15911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855038.15930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855038.15938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855038.16045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855038.17957: stdout chunk (state=3): >>>ansible-tmp-1726855038.1461856-20258-49879263475192=/root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192 <<< 18823 1726855038.18269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855038.18273: stdout chunk (state=3): >>><<< 18823 1726855038.18279: stderr chunk (state=3): >>><<< 18823 1726855038.18300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855038.1461856-20258-49879263475192=/root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855038.18347: variable 'ansible_module_compression' from source: unknown 18823 1726855038.18394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18823 1726855038.18423: variable 'ansible_facts' from source: unknown 18823 1726855038.18734: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py 18823 1726855038.19009: Sending initial data 18823 1726855038.19013: Sent initial data (167 bytes) 18823 1726855038.20111: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855038.20121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855038.20132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855038.20146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855038.20158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 <<< 18823 1726855038.20283: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855038.20596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855038.20600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855038.22093: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855038.22161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855038.22418: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp29kuzk8f /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py <<< 18823 1726855038.22432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py" <<< 18823 1726855038.22715: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp29kuzk8f" to remote "/root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py" <<< 18823 1726855038.24849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855038.24854: stderr chunk (state=3): >>><<< 18823 1726855038.24858: stdout chunk (state=3): >>><<< 18823 1726855038.24886: done transferring module to remote 18823 1726855038.24899: _low_level_execute_command(): starting 18823 1726855038.24907: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/ /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py && sleep 0' 18823 1726855038.26010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855038.26109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855038.26117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855038.26133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855038.26146: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855038.26153: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855038.26162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855038.26177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855038.26185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855038.26193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855038.26710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855038.26714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855038.26716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855038.26718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855038.28644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855038.28746: stderr chunk (state=3): >>><<< 18823 1726855038.28903: stdout chunk (state=3): >>><<< 18823 1726855038.28927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855038.28933: _low_level_execute_command(): starting 18823 1726855038.28939: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/AnsiballZ_network_connections.py && sleep 0' 18823 1726855038.30003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855038.30298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855038.30302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855038.30593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855038.60199: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18823 1726855038.62095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855038.62099: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855038.62102: stdout chunk (state=3): >>><<< 18823 1726855038.62106: stderr chunk (state=3): >>><<< 18823 1726855038.62113: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855038.62153: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855038.62160: _low_level_execute_command(): starting 18823 1726855038.62165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855038.1461856-20258-49879263475192/ > /dev/null 2>&1 && sleep 0' 18823 1726855038.62786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855038.62796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855038.62809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855038.62820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855038.62830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855038.62906: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855038.62931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855038.62948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855038.62970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855038.63079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855038.65100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855038.65107: stderr chunk (state=3): >>><<< 18823 1726855038.65110: stdout chunk (state=3): >>><<< 18823 1726855038.65112: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855038.65115: handler run complete 18823 1726855038.65117: attempt loop complete, returning result 18823 1726855038.65119: _execute() done 18823 1726855038.65121: dumping result to json 18823 1726855038.65122: done dumping result, returning 18823 1726855038.65124: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-d391-077c-00000000004c] 18823 1726855038.65126: sending task result for task 0affcc66-ac2b-d391-077c-00000000004c 18823 1726855038.65215: done sending task result for task 0affcc66-ac2b-d391-077c-00000000004c 18823 1726855038.65222: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18823 1726855038.65322: no more pending results, returning what we have 18823 1726855038.65325: results queue empty 18823 1726855038.65327: checking for any_errors_fatal 18823 1726855038.65334: done checking for any_errors_fatal 18823 1726855038.65335: checking for max_fail_percentage 18823 1726855038.65337: done checking for max_fail_percentage 18823 1726855038.65338: checking to see if all hosts have failed and the running result is not ok 18823 1726855038.65339: done checking to see if all hosts have failed 18823 1726855038.65345: getting the remaining hosts for this loop 18823 1726855038.65347: done getting the remaining hosts for this loop 18823 
1726855038.65351: getting the next task for host managed_node2 18823 1726855038.65359: done getting next task for host managed_node2 18823 1726855038.65363: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18823 1726855038.65365: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855038.65375: getting variables 18823 1726855038.65378: in VariableManager get_vars() 18823 1726855038.65946: Calling all_inventory to load vars for managed_node2 18823 1726855038.65949: Calling groups_inventory to load vars for managed_node2 18823 1726855038.65952: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855038.65962: Calling all_plugins_play to load vars for managed_node2 18823 1726855038.65965: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855038.65968: Calling groups_plugins_play to load vars for managed_node2 18823 1726855038.67700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855038.69860: done with get_vars() 18823 1726855038.69895: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:57:18 -0400 (0:00:00.792) 0:00:30.351 ****** 18823 1726855038.69986: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18823 1726855038.70323: worker is 1 (out of 1 available) 18823 1726855038.70336: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18823 1726855038.70349: done queuing things up, now waiting for results queue to drain 18823 
1726855038.70350: waiting for pending results... 18823 1726855038.70639: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 18823 1726855038.70761: in run() - task 0affcc66-ac2b-d391-077c-00000000004d 18823 1726855038.70784: variable 'ansible_search_path' from source: unknown 18823 1726855038.70796: variable 'ansible_search_path' from source: unknown 18823 1726855038.70840: calling self._execute() 18823 1726855038.70941: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.70952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.70993: variable 'omit' from source: magic vars 18823 1726855038.71372: variable 'ansible_distribution_major_version' from source: facts 18823 1726855038.71390: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855038.71519: variable 'network_state' from source: role '' defaults 18823 1726855038.71693: Evaluated conditional (network_state != {}): False 18823 1726855038.71696: when evaluation is False, skipping this task 18823 1726855038.71699: _execute() done 18823 1726855038.71701: dumping result to json 18823 1726855038.71706: done dumping result, returning 18823 1726855038.71708: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-d391-077c-00000000004d] 18823 1726855038.71711: sending task result for task 0affcc66-ac2b-d391-077c-00000000004d 18823 1726855038.71783: done sending task result for task 0affcc66-ac2b-d391-077c-00000000004d 18823 1726855038.71789: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855038.71846: no more pending results, returning what we have 18823 1726855038.71851: results queue empty 18823 1726855038.71852: checking for any_errors_fatal 18823 
1726855038.71865: done checking for any_errors_fatal 18823 1726855038.71866: checking for max_fail_percentage 18823 1726855038.71868: done checking for max_fail_percentage 18823 1726855038.71869: checking to see if all hosts have failed and the running result is not ok 18823 1726855038.71869: done checking to see if all hosts have failed 18823 1726855038.71870: getting the remaining hosts for this loop 18823 1726855038.71872: done getting the remaining hosts for this loop 18823 1726855038.71876: getting the next task for host managed_node2 18823 1726855038.71883: done getting next task for host managed_node2 18823 1726855038.71890: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18823 1726855038.71894: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855038.71913: getting variables 18823 1726855038.71915: in VariableManager get_vars() 18823 1726855038.71953: Calling all_inventory to load vars for managed_node2 18823 1726855038.71956: Calling groups_inventory to load vars for managed_node2 18823 1726855038.71958: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855038.71970: Calling all_plugins_play to load vars for managed_node2 18823 1726855038.71973: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855038.71975: Calling groups_plugins_play to load vars for managed_node2 18823 1726855038.73740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855038.75314: done with get_vars() 18823 1726855038.75340: done getting variables 18823 1726855038.75405: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:57:18 -0400 (0:00:00.054) 0:00:30.406 ****** 18823 1726855038.75437: entering _queue_task() for managed_node2/debug 18823 1726855038.75801: worker is 1 (out of 1 available) 18823 1726855038.75817: exiting _queue_task() for managed_node2/debug 18823 1726855038.75830: done queuing things up, now waiting for results queue to drain 18823 1726855038.75831: waiting for pending results... 
18823 1726855038.76219: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18823 1726855038.76244: in run() - task 0affcc66-ac2b-d391-077c-00000000004e 18823 1726855038.76265: variable 'ansible_search_path' from source: unknown 18823 1726855038.76273: variable 'ansible_search_path' from source: unknown 18823 1726855038.76325: calling self._execute() 18823 1726855038.76431: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.76443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.76459: variable 'omit' from source: magic vars 18823 1726855038.76855: variable 'ansible_distribution_major_version' from source: facts 18823 1726855038.76859: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855038.76861: variable 'omit' from source: magic vars 18823 1726855038.77294: variable 'omit' from source: magic vars 18823 1726855038.77297: variable 'omit' from source: magic vars 18823 1726855038.77299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855038.77301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855038.77306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855038.77308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855038.77309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855038.77327: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855038.77334: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.77340: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855038.77489: Set connection var ansible_timeout to 10 18823 1726855038.77631: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855038.77639: Set connection var ansible_shell_type to sh 18823 1726855038.77649: Set connection var ansible_shell_executable to /bin/sh 18823 1726855038.77660: Set connection var ansible_connection to ssh 18823 1726855038.77679: Set connection var ansible_pipelining to False 18823 1726855038.77716: variable 'ansible_shell_executable' from source: unknown 18823 1726855038.77946: variable 'ansible_connection' from source: unknown 18823 1726855038.77949: variable 'ansible_module_compression' from source: unknown 18823 1726855038.77952: variable 'ansible_shell_type' from source: unknown 18823 1726855038.77955: variable 'ansible_shell_executable' from source: unknown 18823 1726855038.77957: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.77959: variable 'ansible_pipelining' from source: unknown 18823 1726855038.77961: variable 'ansible_timeout' from source: unknown 18823 1726855038.77963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.78191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855038.78195: variable 'omit' from source: magic vars 18823 1726855038.78198: starting attempt loop 18823 1726855038.78200: running the handler 18823 1726855038.78483: variable '__network_connections_result' from source: set_fact 18823 1726855038.78812: handler run complete 18823 1726855038.78815: attempt loop complete, returning result 18823 1726855038.78818: _execute() done 18823 1726855038.78820: dumping result to json 18823 1726855038.78822: 
done dumping result, returning 18823 1726855038.78824: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-d391-077c-00000000004e] 18823 1726855038.78826: sending task result for task 0affcc66-ac2b-d391-077c-00000000004e ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 18823 1726855038.78977: no more pending results, returning what we have 18823 1726855038.78980: results queue empty 18823 1726855038.78980: checking for any_errors_fatal 18823 1726855038.78986: done checking for any_errors_fatal 18823 1726855038.78986: checking for max_fail_percentage 18823 1726855038.78990: done checking for max_fail_percentage 18823 1726855038.78991: checking to see if all hosts have failed and the running result is not ok 18823 1726855038.78992: done checking to see if all hosts have failed 18823 1726855038.78992: getting the remaining hosts for this loop 18823 1726855038.78994: done getting the remaining hosts for this loop 18823 1726855038.78998: getting the next task for host managed_node2 18823 1726855038.79009: done getting next task for host managed_node2 18823 1726855038.79013: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18823 1726855038.79015: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855038.79026: getting variables 18823 1726855038.79028: in VariableManager get_vars() 18823 1726855038.79067: Calling all_inventory to load vars for managed_node2 18823 1726855038.79070: Calling groups_inventory to load vars for managed_node2 18823 1726855038.79073: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855038.79083: Calling all_plugins_play to load vars for managed_node2 18823 1726855038.79086: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855038.79395: Calling groups_plugins_play to load vars for managed_node2 18823 1726855038.80129: done sending task result for task 0affcc66-ac2b-d391-077c-00000000004e 18823 1726855038.80132: WORKER PROCESS EXITING 18823 1726855038.82333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855038.84694: done with get_vars() 18823 1726855038.84729: done getting variables 18823 1726855038.84792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:57:18 -0400 (0:00:00.093) 0:00:30.499 ****** 18823 1726855038.84826: entering _queue_task() for managed_node2/debug 18823 1726855038.85185: worker is 1 (out of 1 available) 18823 1726855038.85199: exiting _queue_task() for managed_node2/debug 18823 1726855038.85214: done queuing things up, now waiting for results queue to drain 18823 1726855038.85215: waiting for pending results... 
18823 1726855038.85501: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18823 1726855038.85619: in run() - task 0affcc66-ac2b-d391-077c-00000000004f 18823 1726855038.85643: variable 'ansible_search_path' from source: unknown 18823 1726855038.85651: variable 'ansible_search_path' from source: unknown 18823 1726855038.85699: calling self._execute() 18823 1726855038.85811: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.85821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.85835: variable 'omit' from source: magic vars 18823 1726855038.86223: variable 'ansible_distribution_major_version' from source: facts 18823 1726855038.86238: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855038.86247: variable 'omit' from source: magic vars 18823 1726855038.86295: variable 'omit' from source: magic vars 18823 1726855038.86334: variable 'omit' from source: magic vars 18823 1726855038.86377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855038.86422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855038.86447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855038.86468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855038.86482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855038.86592: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855038.86597: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.86599: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855038.86643: Set connection var ansible_timeout to 10 18823 1726855038.86655: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855038.86713: Set connection var ansible_shell_type to sh 18823 1726855038.86716: Set connection var ansible_shell_executable to /bin/sh 18823 1726855038.86718: Set connection var ansible_connection to ssh 18823 1726855038.86720: Set connection var ansible_pipelining to False 18823 1726855038.86722: variable 'ansible_shell_executable' from source: unknown 18823 1726855038.86724: variable 'ansible_connection' from source: unknown 18823 1726855038.86730: variable 'ansible_module_compression' from source: unknown 18823 1726855038.86736: variable 'ansible_shell_type' from source: unknown 18823 1726855038.86742: variable 'ansible_shell_executable' from source: unknown 18823 1726855038.86748: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.86755: variable 'ansible_pipelining' from source: unknown 18823 1726855038.86761: variable 'ansible_timeout' from source: unknown 18823 1726855038.86768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.86914: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855038.86934: variable 'omit' from source: magic vars 18823 1726855038.86944: starting attempt loop 18823 1726855038.86992: running the handler 18823 1726855038.87009: variable '__network_connections_result' from source: set_fact 18823 1726855038.87102: variable '__network_connections_result' from source: set_fact 18823 1726855038.87219: handler run complete 18823 1726855038.87246: attempt loop complete, returning result 18823 1726855038.87257: 
_execute() done 18823 1726855038.87264: dumping result to json 18823 1726855038.87272: done dumping result, returning 18823 1726855038.87365: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-d391-077c-00000000004f] 18823 1726855038.87368: sending task result for task 0affcc66-ac2b-d391-077c-00000000004f 18823 1726855038.87443: done sending task result for task 0affcc66-ac2b-d391-077c-00000000004f 18823 1726855038.87446: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18823 1726855038.87552: no more pending results, returning what we have 18823 1726855038.87555: results queue empty 18823 1726855038.87556: checking for any_errors_fatal 18823 1726855038.87561: done checking for any_errors_fatal 18823 1726855038.87562: checking for max_fail_percentage 18823 1726855038.87563: done checking for max_fail_percentage 18823 1726855038.87564: checking to see if all hosts have failed and the running result is not ok 18823 1726855038.87565: done checking to see if all hosts have failed 18823 1726855038.87566: getting the remaining hosts for this loop 18823 1726855038.87568: done getting the remaining hosts for this loop 18823 1726855038.87571: getting the next task for host managed_node2 18823 1726855038.87579: done getting next task for host managed_node2 18823 1726855038.87582: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18823 1726855038.87585: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855038.87596: getting variables 18823 1726855038.87598: in VariableManager get_vars() 18823 1726855038.87638: Calling all_inventory to load vars for managed_node2 18823 1726855038.87641: Calling groups_inventory to load vars for managed_node2 18823 1726855038.87643: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855038.87654: Calling all_plugins_play to load vars for managed_node2 18823 1726855038.87657: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855038.87661: Calling groups_plugins_play to load vars for managed_node2 18823 1726855038.89420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855038.92311: done with get_vars() 18823 1726855038.92342: done getting variables 18823 1726855038.92411: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:57:18 -0400 (0:00:00.076) 0:00:30.576 ****** 18823 1726855038.92449: entering _queue_task() for managed_node2/debug 18823 1726855038.93340: worker is 1 (out of 1 available) 18823 1726855038.93353: exiting _queue_task() for managed_node2/debug 18823 1726855038.93364: done queuing things up, now waiting for results queue to drain 18823 1726855038.93365: waiting for pending results... 
18823 1726855038.94111: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18823 1726855038.94117: in run() - task 0affcc66-ac2b-d391-077c-000000000050 18823 1726855038.94121: variable 'ansible_search_path' from source: unknown 18823 1726855038.94123: variable 'ansible_search_path' from source: unknown 18823 1726855038.94233: calling self._execute() 18823 1726855038.94497: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855038.94515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855038.94543: variable 'omit' from source: magic vars 18823 1726855038.95392: variable 'ansible_distribution_major_version' from source: facts 18823 1726855038.95622: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855038.95733: variable 'network_state' from source: role '' defaults 18823 1726855038.95749: Evaluated conditional (network_state != {}): False 18823 1726855038.95757: when evaluation is False, skipping this task 18823 1726855038.95948: _execute() done 18823 1726855038.95952: dumping result to json 18823 1726855038.95955: done dumping result, returning 18823 1726855038.95957: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-d391-077c-000000000050] 18823 1726855038.95960: sending task result for task 0affcc66-ac2b-d391-077c-000000000050 18823 1726855038.96037: done sending task result for task 0affcc66-ac2b-d391-077c-000000000050 18823 1726855038.96041: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 18823 1726855038.96107: no more pending results, returning what we have 18823 1726855038.96112: results queue empty 18823 1726855038.96113: checking for any_errors_fatal 18823 1726855038.96123: done checking for any_errors_fatal 18823 1726855038.96124: checking for 
max_fail_percentage 18823 1726855038.96126: done checking for max_fail_percentage 18823 1726855038.96127: checking to see if all hosts have failed and the running result is not ok 18823 1726855038.96128: done checking to see if all hosts have failed 18823 1726855038.96129: getting the remaining hosts for this loop 18823 1726855038.96131: done getting the remaining hosts for this loop 18823 1726855038.96135: getting the next task for host managed_node2 18823 1726855038.96142: done getting next task for host managed_node2 18823 1726855038.96146: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18823 1726855038.96150: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855038.96166: getting variables 18823 1726855038.96168: in VariableManager get_vars() 18823 1726855038.96214: Calling all_inventory to load vars for managed_node2 18823 1726855038.96217: Calling groups_inventory to load vars for managed_node2 18823 1726855038.96220: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855038.96233: Calling all_plugins_play to load vars for managed_node2 18823 1726855038.96236: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855038.96239: Calling groups_plugins_play to load vars for managed_node2 18823 1726855038.99306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.02168: done with get_vars() 18823 1726855039.02193: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:57:19 -0400 
(0:00:00.098) 0:00:30.675 ****** 18823 1726855039.02333: entering _queue_task() for managed_node2/ping 18823 1726855039.02819: worker is 1 (out of 1 available) 18823 1726855039.02831: exiting _queue_task() for managed_node2/ping 18823 1726855039.02843: done queuing things up, now waiting for results queue to drain 18823 1726855039.02844: waiting for pending results... 18823 1726855039.03127: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 18823 1726855039.03238: in run() - task 0affcc66-ac2b-d391-077c-000000000051 18823 1726855039.03256: variable 'ansible_search_path' from source: unknown 18823 1726855039.03262: variable 'ansible_search_path' from source: unknown 18823 1726855039.03305: calling self._execute() 18823 1726855039.03425: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855039.03428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855039.03431: variable 'omit' from source: magic vars 18823 1726855039.03826: variable 'ansible_distribution_major_version' from source: facts 18823 1726855039.03846: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855039.03970: variable 'omit' from source: magic vars 18823 1726855039.03974: variable 'omit' from source: magic vars 18823 1726855039.03976: variable 'omit' from source: magic vars 18823 1726855039.03996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855039.04038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855039.04065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855039.04095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855039.04115: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855039.04158: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855039.04168: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855039.04176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855039.04285: Set connection var ansible_timeout to 10 18823 1726855039.04305: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855039.04315: Set connection var ansible_shell_type to sh 18823 1726855039.04326: Set connection var ansible_shell_executable to /bin/sh 18823 1726855039.04337: Set connection var ansible_connection to ssh 18823 1726855039.04347: Set connection var ansible_pipelining to False 18823 1726855039.04377: variable 'ansible_shell_executable' from source: unknown 18823 1726855039.04385: variable 'ansible_connection' from source: unknown 18823 1726855039.04443: variable 'ansible_module_compression' from source: unknown 18823 1726855039.04505: variable 'ansible_shell_type' from source: unknown 18823 1726855039.04509: variable 'ansible_shell_executable' from source: unknown 18823 1726855039.04511: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855039.04512: variable 'ansible_pipelining' from source: unknown 18823 1726855039.04514: variable 'ansible_timeout' from source: unknown 18823 1726855039.04516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855039.05050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855039.05054: variable 'omit' from source: magic vars 18823 1726855039.05057: starting attempt loop 18823 1726855039.05059: running 
the handler 18823 1726855039.05061: _low_level_execute_command(): starting 18823 1726855039.05063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855039.05810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.05834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.06017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.06143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.07878: stdout chunk (state=3): >>>/root <<< 18823 1726855039.08031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.08034: stdout chunk (state=3): >>><<< 18823 1726855039.08038: stderr chunk (state=3): >>><<< 18823 1726855039.08060: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.08082: _low_level_execute_command(): starting 18823 1726855039.08097: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462 `" && echo ansible-tmp-1726855039.0806715-20310-51133490941462="` echo /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462 `" ) && sleep 0' 18823 1726855039.09214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.09309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.09382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.09406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.09431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.09720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.11638: stdout chunk (state=3): >>>ansible-tmp-1726855039.0806715-20310-51133490941462=/root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462 <<< 18823 1726855039.11770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.11793: stderr chunk (state=3): >>><<< 18823 1726855039.11807: stdout chunk (state=3): >>><<< 18823 1726855039.11891: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855039.0806715-20310-51133490941462=/root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.12095: variable 'ansible_module_compression' from source: unknown 18823 1726855039.12099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18823 1726855039.12102: variable 'ansible_facts' from source: unknown 18823 1726855039.12612: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py 18823 1726855039.12730: Sending initial data 18823 1726855039.12740: Sent initial data (152 bytes) 18823 1726855039.13882: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.14206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.14248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.14342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.15925: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855039.15989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855039.16081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpyj6yi9p1 /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py <<< 18823 1726855039.16085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py" <<< 18823 1726855039.16173: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpyj6yi9p1" to remote "/root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py" <<< 18823 1726855039.17401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.17408: stdout chunk (state=3): >>><<< 18823 1726855039.17412: stderr chunk (state=3): >>><<< 18823 1726855039.17435: done transferring module to remote 18823 1726855039.17483: _low_level_execute_command(): starting 18823 1726855039.17486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/ /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py && sleep 0' 18823 1726855039.18382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.18397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.18409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.18460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.18464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 
1726855039.18467: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855039.18469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.18472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855039.18474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855039.18482: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855039.18484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.18498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.18520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.18523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855039.18601: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.18610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.18705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.20794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.20798: stdout chunk (state=3): >>><<< 18823 1726855039.20801: stderr chunk (state=3): >>><<< 18823 1726855039.20803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.20806: _low_level_execute_command(): starting 18823 1726855039.20808: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/AnsiballZ_ping.py && sleep 0' 18823 1726855039.21285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.21294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.21306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.21324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.21337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855039.21343: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855039.21362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
18823 1726855039.21380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855039.21384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855039.21393: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855039.21402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.21415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.21469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.21506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.21521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.21559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.21638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.36488: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18823 1726855039.37895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855039.37899: stdout chunk (state=3): >>><<< 18823 1726855039.37901: stderr chunk (state=3): >>><<< 18823 1726855039.37906: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
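The module run above ends with the target printing `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`. As a rough stand-in for what the transferred `AnsiballZ_ping.py` payload does once unpacked (a sketch, not Ansible's actual module source), the core logic is just echoing the `data` argument back:

```python
import json

# Hypothetical reduction of ansible.builtin.ping: return the "data" argument
# (default "pong") so the controller can verify end-to-end module execution.
def ping_module(module_args):
    data = module_args.get("data", "pong")
    if data == "crash":
        # the documented behavior of the real module: data=crash raises
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
# {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```

A "pong" here proves the whole pipeline worked: SSH multiplexed connection, tmpdir staging, SFTP transfer, chmod, and a working remote Python, which is why the role uses it to re-test connectivity.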
18823 1726855039.37910: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855039.37912: _low_level_execute_command(): starting 18823 1726855039.37919: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855039.0806715-20310-51133490941462/ > /dev/null 2>&1 && sleep 0' 18823 1726855039.38642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.38646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.38648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.38650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.38652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855039.38654: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855039.38656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.38657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855039.38659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 
1726855039.38676: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.38723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.38735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.38754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.38859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.40898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.40901: stdout chunk (state=3): >>><<< 18823 1726855039.40905: stderr chunk (state=3): >>><<< 18823 1726855039.40908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.40910: handler run complete 18823 1726855039.40911: attempt loop complete, returning result 18823 1726855039.40913: _execute() done 18823 1726855039.40914: dumping result to json 18823 1726855039.40916: done dumping result, returning 18823 1726855039.40917: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-d391-077c-000000000051] 18823 1726855039.40919: sending task result for task 0affcc66-ac2b-d391-077c-000000000051 18823 1726855039.40977: done sending task result for task 0affcc66-ac2b-d391-077c-000000000051 18823 1726855039.40979: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 18823 1726855039.41053: no more pending results, returning what we have 18823 1726855039.41056: results queue empty 18823 1726855039.41057: checking for any_errors_fatal 18823 1726855039.41062: done checking for any_errors_fatal 18823 1726855039.41062: checking for max_fail_percentage 18823 1726855039.41064: done checking for max_fail_percentage 18823 1726855039.41065: checking to see if all hosts have failed and the running result is not ok 18823 1726855039.41066: done checking to see if all hosts have failed 18823 1726855039.41066: getting the remaining hosts for this loop 18823 1726855039.41068: done getting the remaining hosts for this loop 18823 1726855039.41071: getting the next task for host managed_node2 18823 1726855039.41078: done getting next task for host managed_node2 18823 1726855039.41080: ^ task is: TASK: meta (role_complete) 18823 1726855039.41081: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855039.41098: getting variables 18823 1726855039.41100: in VariableManager get_vars() 18823 1726855039.41135: Calling all_inventory to load vars for managed_node2 18823 1726855039.41137: Calling groups_inventory to load vars for managed_node2 18823 1726855039.41139: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855039.41149: Calling all_plugins_play to load vars for managed_node2 18823 1726855039.41152: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855039.41155: Calling groups_plugins_play to load vars for managed_node2 18823 1726855039.43883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.45759: done with get_vars() 18823 1726855039.45794: done getting variables 18823 1726855039.45882: done queuing things up, now waiting for results queue to drain 18823 1726855039.45884: results queue empty 18823 1726855039.45885: checking for any_errors_fatal 18823 1726855039.45890: done checking for any_errors_fatal 18823 1726855039.45892: checking for max_fail_percentage 18823 1726855039.45893: done checking for max_fail_percentage 18823 1726855039.45894: checking to see if all hosts have failed and the running result is not ok 18823 1726855039.45895: done checking to see if all hosts have failed 18823 1726855039.45896: getting the remaining hosts for this loop 18823 1726855039.45897: done getting the remaining hosts for this loop 18823 1726855039.45900: getting the next task for host managed_node2 18823 1726855039.45903: done getting next task for host managed_node2 18823 1726855039.45905: ^ task is: TASK: meta (flush_handlers) 18823 1726855039.45906: ^ state is: HOST STATE: block=4, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855039.45909: getting variables 18823 1726855039.45910: in VariableManager get_vars() 18823 1726855039.45924: Calling all_inventory to load vars for managed_node2 18823 1726855039.45927: Calling groups_inventory to load vars for managed_node2 18823 1726855039.45929: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855039.45934: Calling all_plugins_play to load vars for managed_node2 18823 1726855039.45937: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855039.45939: Calling groups_plugins_play to load vars for managed_node2 18823 1726855039.47482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.49059: done with get_vars() 18823 1726855039.49084: done getting variables 18823 1726855039.49132: in VariableManager get_vars() 18823 1726855039.49144: Calling all_inventory to load vars for managed_node2 18823 1726855039.49146: Calling groups_inventory to load vars for managed_node2 18823 1726855039.49148: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855039.49152: Calling all_plugins_play to load vars for managed_node2 18823 1726855039.49154: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855039.49156: Calling groups_plugins_play to load vars for managed_node2 18823 1726855039.50447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.51935: done with get_vars() 18823 1726855039.51969: done queuing things up, now waiting for results queue to drain 18823 1726855039.51971: results queue empty 18823 1726855039.51972: checking for any_errors_fatal 18823 1726855039.51973: done 
checking for any_errors_fatal 18823 1726855039.51974: checking for max_fail_percentage 18823 1726855039.51975: done checking for max_fail_percentage 18823 1726855039.51975: checking to see if all hosts have failed and the running result is not ok 18823 1726855039.51976: done checking to see if all hosts have failed 18823 1726855039.51976: getting the remaining hosts for this loop 18823 1726855039.51977: done getting the remaining hosts for this loop 18823 1726855039.51980: getting the next task for host managed_node2 18823 1726855039.51983: done getting next task for host managed_node2 18823 1726855039.51991: ^ task is: TASK: meta (flush_handlers) 18823 1726855039.51993: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855039.51996: getting variables 18823 1726855039.51997: in VariableManager get_vars() 18823 1726855039.52010: Calling all_inventory to load vars for managed_node2 18823 1726855039.52012: Calling groups_inventory to load vars for managed_node2 18823 1726855039.52013: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855039.52018: Calling all_plugins_play to load vars for managed_node2 18823 1726855039.52020: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855039.52022: Calling groups_plugins_play to load vars for managed_node2 18823 1726855039.53245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.54751: done with get_vars() 18823 1726855039.54772: done getting variables 18823 1726855039.54821: in VariableManager get_vars() 18823 1726855039.54833: Calling all_inventory to load vars for managed_node2 18823 1726855039.54835: Calling groups_inventory to load vars for managed_node2 
18823 1726855039.54836: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855039.54841: Calling all_plugins_play to load vars for managed_node2 18823 1726855039.54843: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855039.54845: Calling groups_plugins_play to load vars for managed_node2 18823 1726855039.55938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.57434: done with get_vars() 18823 1726855039.57467: done queuing things up, now waiting for results queue to drain 18823 1726855039.57469: results queue empty 18823 1726855039.57470: checking for any_errors_fatal 18823 1726855039.57471: done checking for any_errors_fatal 18823 1726855039.57472: checking for max_fail_percentage 18823 1726855039.57474: done checking for max_fail_percentage 18823 1726855039.57474: checking to see if all hosts have failed and the running result is not ok 18823 1726855039.57475: done checking to see if all hosts have failed 18823 1726855039.57476: getting the remaining hosts for this loop 18823 1726855039.57476: done getting the remaining hosts for this loop 18823 1726855039.57480: getting the next task for host managed_node2 18823 1726855039.57483: done getting next task for host managed_node2 18823 1726855039.57484: ^ task is: None 18823 1726855039.57485: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855039.57488: done queuing things up, now waiting for results queue to drain 18823 1726855039.57490: results queue empty 18823 1726855039.57492: checking for any_errors_fatal 18823 1726855039.57492: done checking for any_errors_fatal 18823 1726855039.57493: checking for max_fail_percentage 18823 1726855039.57494: done checking for max_fail_percentage 18823 1726855039.57495: checking to see if all hosts have failed and the running result is not ok 18823 1726855039.57495: done checking to see if all hosts have failed 18823 1726855039.57497: getting the next task for host managed_node2 18823 1726855039.57499: done getting next task for host managed_node2 18823 1726855039.57500: ^ task is: None 18823 1726855039.57501: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855039.57547: in VariableManager get_vars() 18823 1726855039.57562: done with get_vars() 18823 1726855039.57568: in VariableManager get_vars() 18823 1726855039.57576: done with get_vars() 18823 1726855039.57580: variable 'omit' from source: magic vars 18823 1726855039.57610: in VariableManager get_vars() 18823 1726855039.57620: done with get_vars() 18823 1726855039.57638: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 18823 1726855039.57880: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855039.57908: getting the remaining hosts for this loop 18823 1726855039.57909: done getting the remaining hosts for this loop 18823 1726855039.57912: getting the next task for host managed_node2 18823 1726855039.57915: done getting next task for host managed_node2 18823 1726855039.57917: ^ task is: TASK: Gathering Facts 18823 1726855039.57919: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855039.57921: getting variables 18823 1726855039.57922: in VariableManager get_vars() 18823 1726855039.57931: Calling all_inventory to load vars for managed_node2 18823 1726855039.57933: Calling groups_inventory to load vars for managed_node2 18823 1726855039.57935: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855039.57941: Calling all_plugins_play to load vars for managed_node2 18823 1726855039.57943: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855039.57946: Calling groups_plugins_play to load vars for managed_node2 18823 1726855039.59233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855039.60762: done with get_vars() 18823 1726855039.60783: done getting variables 18823 1726855039.60826: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 13:57:19 -0400 (0:00:00.585) 0:00:31.260 ****** 18823 1726855039.60852: entering _queue_task() for managed_node2/gather_facts 18823 1726855039.61191: worker is 1 (out of 1 available) 18823 1726855039.61202: exiting _queue_task() for managed_node2/gather_facts 18823 1726855039.61214: done queuing things up, now waiting for results queue to drain 18823 1726855039.61215: waiting for pending results... 
18823 1726855039.61605: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855039.61610: in run() - task 0affcc66-ac2b-d391-077c-0000000003f8 18823 1726855039.61614: variable 'ansible_search_path' from source: unknown 18823 1726855039.61643: calling self._execute() 18823 1726855039.61742: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855039.61754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855039.61769: variable 'omit' from source: magic vars 18823 1726855039.62160: variable 'ansible_distribution_major_version' from source: facts 18823 1726855039.62177: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855039.62190: variable 'omit' from source: magic vars 18823 1726855039.62222: variable 'omit' from source: magic vars 18823 1726855039.62268: variable 'omit' from source: magic vars 18823 1726855039.62313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855039.62352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855039.62375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855039.62397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855039.62413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855039.62447: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855039.62459: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855039.62466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855039.62572: Set connection var ansible_timeout to 10 18823 1726855039.62585: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855039.62594: Set connection var ansible_shell_type to sh 18823 1726855039.62605: Set connection var ansible_shell_executable to /bin/sh 18823 1726855039.62616: Set connection var ansible_connection to ssh 18823 1726855039.62624: Set connection var ansible_pipelining to False 18823 1726855039.62653: variable 'ansible_shell_executable' from source: unknown 18823 1726855039.62660: variable 'ansible_connection' from source: unknown 18823 1726855039.62666: variable 'ansible_module_compression' from source: unknown 18823 1726855039.62675: variable 'ansible_shell_type' from source: unknown 18823 1726855039.62684: variable 'ansible_shell_executable' from source: unknown 18823 1726855039.62792: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855039.62795: variable 'ansible_pipelining' from source: unknown 18823 1726855039.62797: variable 'ansible_timeout' from source: unknown 18823 1726855039.62799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855039.62885: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855039.62906: variable 'omit' from source: magic vars 18823 1726855039.62916: starting attempt loop 18823 1726855039.62923: running the handler 18823 1726855039.62941: variable 'ansible_facts' from source: unknown 18823 1726855039.62964: _low_level_execute_command(): starting 18823 1726855039.62976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855039.63717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.63732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 
1726855039.63783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855039.63802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.63879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.63907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.64012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.64182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.65909: stdout chunk (state=3): >>>/root <<< 18823 1726855039.66040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.66052: stdout chunk (state=3): >>><<< 18823 1726855039.66072: stderr chunk (state=3): >>><<< 18823 1726855039.66190: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.66194: _low_level_execute_command(): starting 18823 1726855039.66200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416 `" && echo ansible-tmp-1726855039.6609807-20343-20424017466416="` echo /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416 `" ) && sleep 0' 18823 1726855039.66734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855039.66754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.66769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.66786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.66862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.66910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.66928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.66950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.67128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.69050: stdout chunk (state=3): >>>ansible-tmp-1726855039.6609807-20343-20424017466416=/root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416 <<< 18823 1726855039.69290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.69294: stdout chunk (state=3): >>><<< 18823 1726855039.69296: stderr chunk (state=3): >>><<< 18823 1726855039.69298: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855039.6609807-20343-20424017466416=/root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.69300: variable 'ansible_module_compression' from source: unknown 18823 1726855039.69600: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855039.69603: variable 'ansible_facts' from source: unknown 18823 1726855039.70017: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py 18823 1726855039.70265: Sending initial data 18823 1726855039.70274: Sent initial data (153 bytes) 18823 1726855039.71548: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855039.71566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.71599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.71895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.71898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.73534: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855039.73605: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp0lbbedwu /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py <<< 18823 1726855039.73609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py" <<< 18823 1726855039.73694: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp0lbbedwu" to remote "/root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py" <<< 18823 1726855039.76096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.76232: stderr chunk (state=3): >>><<< 18823 1726855039.76242: stdout chunk (state=3): >>><<< 18823 1726855039.76503: done transferring module to remote 18823 1726855039.76506: _low_level_execute_command(): starting 18823 1726855039.76509: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/ /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py && sleep 0' 18823 1726855039.77658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855039.77807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855039.77863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.77866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.77928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.77993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855039.79740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855039.79781: stderr chunk (state=3): >>><<< 18823 1726855039.80016: stdout chunk (state=3): >>><<< 18823 1726855039.80094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855039.80098: _low_level_execute_command(): starting 18823 1726855039.80106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/AnsiballZ_setup.py && sleep 0' 18823 1726855039.81192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855039.81407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855039.81445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855039.81449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855039.81538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
18823 1726855040.47472: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "20", "epoch": "1726855040", "epoch_int": "1726855040", "date": "2024-09-20", "time": "13:57:20", "iso8601_micro": "2024-09-20T17:57:20.075236Z", "iso8601": "2024-09-20T17:57:20Z", "iso8601_basic": "20240920T135720075236", "iso8601_basic_short": "20240920T135720", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.59912109375, "5m": 0.4296875, "15m": 0.22021484375}, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", 
"scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]",<<< 18823 1726855040.47522: stdout chunk (state=3): >>> "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": 
"off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": 
"off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", 
"broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8"]}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, <<< 18823 1726855040.47565: stdout chunk (state=3): >>>"ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 823, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794537472, "block_size": 4096, "block_total": 65519099, "block_available": 63914682, "block_used": 1604417, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855040.49594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855040.49598: stdout chunk (state=3): >>><<< 18823 1726855040.49600: stderr chunk (state=3): >>><<< 18823 1726855040.49795: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "20", "epoch": "1726855040", "epoch_int": "1726855040", "date": "2024-09-20", "time": "13:57:20", "iso8601_micro": "2024-09-20T17:57:20.075236Z", "iso8601": "2024-09-20T17:57:20Z", "iso8601_basic": "20240920T135720075236", "iso8601_basic_short": "20240920T135720", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.59912109375, "5m": 0.4296875, "15m": 0.22021484375}, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lsr27", "lo", "peerlsr27"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "36:52:b5:86:2d:b8", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3452:b5ff:fe86:2db8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": 
"off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "de:12:63:64:7b:4b", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::dc12:63ff:fe64:7b4b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": 
"off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": 
"10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8", "fe80::dc12:63ff:fe64:7b4b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71", "fe80::3452:b5ff:fe86:2db8"]}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 823, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794537472, "block_size": 4096, "block_total": 65519099, "block_available": 63914682, "block_used": 1604417, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855040.50178: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855040.50212: _low_level_execute_command(): starting 18823 1726855040.50224: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855039.6609807-20343-20424017466416/ > /dev/null 2>&1 && sleep 0' 18823 1726855040.50891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855040.51030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855040.51034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855040.51063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855040.51165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855040.53070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855040.53073: stdout chunk (state=3): >>><<< 18823 1726855040.53076: stderr chunk (state=3): >>><<< 18823 1726855040.53393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855040.53397: handler run complete 18823 1726855040.53399: variable 'ansible_facts' from source: unknown 18823 1726855040.53657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.54499: variable 'ansible_facts' from source: unknown 18823 1726855040.54601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.54925: attempt loop complete, returning result 18823 1726855040.55033: _execute() done 18823 1726855040.55037: dumping result to json 18823 1726855040.55059: done dumping result, returning 18823 1726855040.55252: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-0000000003f8] 18823 1726855040.55255: sending task result for task 0affcc66-ac2b-d391-077c-0000000003f8 ok: [managed_node2] 18823 1726855040.56806: no more pending results, returning what we have 18823 1726855040.56809: results queue empty 18823 1726855040.56810: checking for any_errors_fatal 18823 1726855040.56811: done checking for any_errors_fatal 18823 1726855040.56812: checking for max_fail_percentage 18823 1726855040.56814: done checking for max_fail_percentage 18823 1726855040.56815: checking to see if all hosts have failed and the running result is not ok 18823 1726855040.56816: done checking to see if all hosts have failed 18823 1726855040.56816: getting the remaining hosts for this loop 18823 1726855040.56818: done getting the remaining hosts for this loop 18823 1726855040.56822: getting the next task for host 
managed_node2 18823 1726855040.57052: done sending task result for task 0affcc66-ac2b-d391-077c-0000000003f8 18823 1726855040.57056: WORKER PROCESS EXITING 18823 1726855040.57060: done getting next task for host managed_node2 18823 1726855040.57062: ^ task is: TASK: meta (flush_handlers) 18823 1726855040.57064: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855040.57068: getting variables 18823 1726855040.57069: in VariableManager get_vars() 18823 1726855040.57091: Calling all_inventory to load vars for managed_node2 18823 1726855040.57094: Calling groups_inventory to load vars for managed_node2 18823 1726855040.57097: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855040.57108: Calling all_plugins_play to load vars for managed_node2 18823 1726855040.57111: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855040.57113: Calling groups_plugins_play to load vars for managed_node2 18823 1726855040.58802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.60428: done with get_vars() 18823 1726855040.60452: done getting variables 18823 1726855040.60532: in VariableManager get_vars() 18823 1726855040.60543: Calling all_inventory to load vars for managed_node2 18823 1726855040.60545: Calling groups_inventory to load vars for managed_node2 18823 1726855040.60547: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855040.60552: Calling all_plugins_play to load vars for managed_node2 18823 1726855040.60555: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855040.60558: Calling groups_plugins_play to load vars for managed_node2 18823 1726855040.61751: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.63367: done with get_vars() 18823 1726855040.63399: done queuing things up, now waiting for results queue to drain 18823 1726855040.63401: results queue empty 18823 1726855040.63402: checking for any_errors_fatal 18823 1726855040.63409: done checking for any_errors_fatal 18823 1726855040.63410: checking for max_fail_percentage 18823 1726855040.63411: done checking for max_fail_percentage 18823 1726855040.63412: checking to see if all hosts have failed and the running result is not ok 18823 1726855040.63412: done checking to see if all hosts have failed 18823 1726855040.63418: getting the remaining hosts for this loop 18823 1726855040.63419: done getting the remaining hosts for this loop 18823 1726855040.63422: getting the next task for host managed_node2 18823 1726855040.63425: done getting next task for host managed_node2 18823 1726855040.63428: ^ task is: TASK: Include the task 'delete_interface.yml' 18823 1726855040.63430: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855040.63432: getting variables 18823 1726855040.63433: in VariableManager get_vars() 18823 1726855040.63447: Calling all_inventory to load vars for managed_node2 18823 1726855040.63450: Calling groups_inventory to load vars for managed_node2 18823 1726855040.63452: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855040.63457: Calling all_plugins_play to load vars for managed_node2 18823 1726855040.63460: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855040.63463: Calling groups_plugins_play to load vars for managed_node2 18823 1726855040.69078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.70643: done with get_vars() 18823 1726855040.70671: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 13:57:20 -0400 (0:00:01.099) 0:00:32.359 ****** 18823 1726855040.70758: entering _queue_task() for managed_node2/include_tasks 18823 1726855040.71213: worker is 1 (out of 1 available) 18823 1726855040.71225: exiting _queue_task() for managed_node2/include_tasks 18823 1726855040.71237: done queuing things up, now waiting for results queue to drain 18823 1726855040.71238: waiting for pending results... 
18823 1726855040.71490: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 18823 1726855040.71606: in run() - task 0affcc66-ac2b-d391-077c-000000000054 18823 1726855040.71627: variable 'ansible_search_path' from source: unknown 18823 1726855040.71675: calling self._execute() 18823 1726855040.71780: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855040.71797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855040.71814: variable 'omit' from source: magic vars 18823 1726855040.72217: variable 'ansible_distribution_major_version' from source: facts 18823 1726855040.72234: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855040.72246: _execute() done 18823 1726855040.72254: dumping result to json 18823 1726855040.72263: done dumping result, returning 18823 1726855040.72275: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [0affcc66-ac2b-d391-077c-000000000054] 18823 1726855040.72285: sending task result for task 0affcc66-ac2b-d391-077c-000000000054 18823 1726855040.72502: done sending task result for task 0affcc66-ac2b-d391-077c-000000000054 18823 1726855040.72506: WORKER PROCESS EXITING 18823 1726855040.72534: no more pending results, returning what we have 18823 1726855040.72539: in VariableManager get_vars() 18823 1726855040.72570: Calling all_inventory to load vars for managed_node2 18823 1726855040.72572: Calling groups_inventory to load vars for managed_node2 18823 1726855040.72575: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855040.72591: Calling all_plugins_play to load vars for managed_node2 18823 1726855040.72595: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855040.72598: Calling groups_plugins_play to load vars for managed_node2 18823 1726855040.73929: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.75596: done with get_vars() 18823 1726855040.75614: variable 'ansible_search_path' from source: unknown 18823 1726855040.75629: we have included files to process 18823 1726855040.75630: generating all_blocks data 18823 1726855040.75631: done generating all_blocks data 18823 1726855040.75632: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18823 1726855040.75633: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18823 1726855040.75635: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18823 1726855040.75857: done processing included file 18823 1726855040.75859: iterating over new_blocks loaded from include file 18823 1726855040.75860: in VariableManager get_vars() 18823 1726855040.75873: done with get_vars() 18823 1726855040.75875: filtering new block on tags 18823 1726855040.75893: done filtering new block on tags 18823 1726855040.75896: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 18823 1726855040.75901: extending task lists for all hosts with included blocks 18823 1726855040.75933: done extending task lists 18823 1726855040.75935: done processing included files 18823 1726855040.75935: results queue empty 18823 1726855040.75936: checking for any_errors_fatal 18823 1726855040.75938: done checking for any_errors_fatal 18823 1726855040.75939: checking for max_fail_percentage 18823 1726855040.75940: done checking for max_fail_percentage 18823 1726855040.75940: checking to see if all hosts have failed and the running result 
is not ok 18823 1726855040.75941: done checking to see if all hosts have failed 18823 1726855040.75942: getting the remaining hosts for this loop 18823 1726855040.75943: done getting the remaining hosts for this loop 18823 1726855040.75946: getting the next task for host managed_node2 18823 1726855040.75949: done getting next task for host managed_node2 18823 1726855040.75951: ^ task is: TASK: Remove test interface if necessary 18823 1726855040.75954: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855040.75956: getting variables 18823 1726855040.75957: in VariableManager get_vars() 18823 1726855040.75965: Calling all_inventory to load vars for managed_node2 18823 1726855040.75967: Calling groups_inventory to load vars for managed_node2 18823 1726855040.75970: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855040.75975: Calling all_plugins_play to load vars for managed_node2 18823 1726855040.75977: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855040.75980: Calling groups_plugins_play to load vars for managed_node2 18823 1726855040.77098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855040.78586: done with get_vars() 18823 1726855040.78609: done getting variables 18823 1726855040.78647: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:57:20 -0400 (0:00:00.079) 0:00:32.438 ****** 18823 1726855040.78674: entering _queue_task() for managed_node2/command 18823 1726855040.79002: worker is 1 (out of 1 available) 18823 1726855040.79014: exiting _queue_task() for managed_node2/command 18823 1726855040.79025: done queuing things up, now waiting for results queue to drain 18823 1726855040.79026: waiting for pending results... 
18823 1726855040.79316: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 18823 1726855040.79384: in run() - task 0affcc66-ac2b-d391-077c-000000000409 18823 1726855040.79415: variable 'ansible_search_path' from source: unknown 18823 1726855040.79593: variable 'ansible_search_path' from source: unknown 18823 1726855040.79596: calling self._execute() 18823 1726855040.79599: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855040.79602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855040.79605: variable 'omit' from source: magic vars 18823 1726855040.79985: variable 'ansible_distribution_major_version' from source: facts 18823 1726855040.80005: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855040.80017: variable 'omit' from source: magic vars 18823 1726855040.80069: variable 'omit' from source: magic vars 18823 1726855040.80170: variable 'interface' from source: set_fact 18823 1726855040.80196: variable 'omit' from source: magic vars 18823 1726855040.80242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855040.80286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855040.80317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855040.80340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855040.80357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855040.80398: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855040.80408: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855040.80417: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855040.80526: Set connection var ansible_timeout to 10 18823 1726855040.80536: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855040.80542: Set connection var ansible_shell_type to sh 18823 1726855040.80549: Set connection var ansible_shell_executable to /bin/sh 18823 1726855040.80556: Set connection var ansible_connection to ssh 18823 1726855040.80563: Set connection var ansible_pipelining to False 18823 1726855040.80700: variable 'ansible_shell_executable' from source: unknown 18823 1726855040.80704: variable 'ansible_connection' from source: unknown 18823 1726855040.80707: variable 'ansible_module_compression' from source: unknown 18823 1726855040.80709: variable 'ansible_shell_type' from source: unknown 18823 1726855040.80711: variable 'ansible_shell_executable' from source: unknown 18823 1726855040.80714: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855040.80716: variable 'ansible_pipelining' from source: unknown 18823 1726855040.80718: variable 'ansible_timeout' from source: unknown 18823 1726855040.80720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855040.80764: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855040.80781: variable 'omit' from source: magic vars 18823 1726855040.80793: starting attempt loop 18823 1726855040.80800: running the handler 18823 1726855040.80826: _low_level_execute_command(): starting 18823 1726855040.80838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855040.81584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855040.81607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855040.81689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855040.81717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855040.81828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855040.83597: stdout chunk (state=3): >>>/root <<< 18823 1726855040.83728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855040.83758: stderr chunk (state=3): >>><<< 18823 1726855040.83761: stdout chunk (state=3): >>><<< 18823 1726855040.83871: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855040.83875: _low_level_execute_command(): starting 18823 1726855040.83878: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156 `" && echo ansible-tmp-1726855040.8378587-20394-111471571450156="` echo /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156 `" ) && sleep 0' 18823 1726855040.84468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855040.84484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855040.84503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855040.84521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855040.84559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855040.84590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855040.84667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855040.84683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855040.84709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855040.84817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855040.86771: stdout chunk (state=3): >>>ansible-tmp-1726855040.8378587-20394-111471571450156=/root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156 <<< 18823 1726855040.86941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855040.86945: stdout chunk (state=3): >>><<< 18823 1726855040.86947: stderr chunk (state=3): >>><<< 18823 1726855040.87094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855040.8378587-20394-111471571450156=/root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855040.87098: variable 'ansible_module_compression' from source: unknown 18823 1726855040.87101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855040.87123: variable 'ansible_facts' from source: unknown 18823 1726855040.87218: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py 18823 1726855040.87461: Sending initial data 18823 1726855040.87463: Sent initial data (156 bytes) 18823 1726855040.88005: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855040.88101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855040.88132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855040.88151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855040.88172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855040.88279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855040.89892: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855040.89994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855040.90050: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp8uiovhs3 /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py <<< 18823 1726855040.90069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py" <<< 18823 1726855040.90156: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18823 1726855040.90181: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp8uiovhs3" to remote "/root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py" <<< 18823 1726855040.91049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855040.91228: stderr chunk (state=3): >>><<< 18823 1726855040.91231: stdout chunk (state=3): >>><<< 18823 1726855040.91233: done transferring module to remote 18823 1726855040.91235: _low_level_execute_command(): starting 18823 1726855040.91237: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/ /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py && sleep 0' 18823 1726855040.91800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855040.91833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855040.91849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855040.91944: stderr chunk (state=3): 
>>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855040.91960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855040.91978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855040.92067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855040.93892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855040.93895: stdout chunk (state=3): >>><<< 18823 1726855040.93903: stderr chunk (state=3): >>><<< 18823 1726855040.93921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855040.93925: _low_level_execute_command(): starting 18823 1726855040.93931: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/AnsiballZ_command.py && sleep 0' 18823 1726855040.94590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855040.94594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855040.94596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855040.94599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855040.94615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855040.94642: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855040.94646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855040.94654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855040.94742: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855040.94789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855040.94866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855041.10919: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 13:57:21.095470", "end": "2024-09-20 13:57:21.103965", "delta": "0:00:00.008495", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855041.13109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855041.13113: stdout chunk (state=3): >>><<< 18823 1726855041.13116: stderr chunk (state=3): >>><<< 18823 1726855041.13118: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 13:57:21.095470", "end": "2024-09-20 13:57:21.103965", "delta": "0:00:00.008495", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855041.13121: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855041.13123: _low_level_execute_command(): starting 18823 1726855041.13125: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855040.8378587-20394-111471571450156/ > /dev/null 2>&1 && sleep 0' 18823 1726855041.13739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855041.13809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855041.13857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855041.13876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855041.13975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855041.15859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855041.15909: stderr chunk (state=3): >>><<< 18823 1726855041.15912: stdout chunk (state=3): >>><<< 18823 1726855041.15927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855041.15933: handler run complete 18823 1726855041.15956: Evaluated 
conditional (False): False 18823 1726855041.15965: attempt loop complete, returning result 18823 1726855041.15968: _execute() done 18823 1726855041.15971: dumping result to json 18823 1726855041.15977: done dumping result, returning 18823 1726855041.15986: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affcc66-ac2b-d391-077c-000000000409] 18823 1726855041.15992: sending task result for task 0affcc66-ac2b-d391-077c-000000000409 ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.008495", "end": "2024-09-20 13:57:21.103965", "rc": 0, "start": "2024-09-20 13:57:21.095470" } 18823 1726855041.16201: no more pending results, returning what we have 18823 1726855041.16206: results queue empty 18823 1726855041.16208: checking for any_errors_fatal 18823 1726855041.16209: done checking for any_errors_fatal 18823 1726855041.16210: checking for max_fail_percentage 18823 1726855041.16211: done checking for max_fail_percentage 18823 1726855041.16212: checking to see if all hosts have failed and the running result is not ok 18823 1726855041.16213: done checking to see if all hosts have failed 18823 1726855041.16213: getting the remaining hosts for this loop 18823 1726855041.16215: done getting the remaining hosts for this loop 18823 1726855041.16218: getting the next task for host managed_node2 18823 1726855041.16225: done getting next task for host managed_node2 18823 1726855041.16227: ^ task is: TASK: meta (flush_handlers) 18823 1726855041.16229: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855041.16232: getting variables 18823 1726855041.16234: in VariableManager get_vars() 18823 1726855041.16309: Calling all_inventory to load vars for managed_node2 18823 1726855041.16312: Calling groups_inventory to load vars for managed_node2 18823 1726855041.16315: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855041.16325: Calling all_plugins_play to load vars for managed_node2 18823 1726855041.16328: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855041.16330: Calling groups_plugins_play to load vars for managed_node2 18823 1726855041.16862: done sending task result for task 0affcc66-ac2b-d391-077c-000000000409 18823 1726855041.16865: WORKER PROCESS EXITING 18823 1726855041.17905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855041.19541: done with get_vars() 18823 1726855041.19562: done getting variables 18823 1726855041.19634: in VariableManager get_vars() 18823 1726855041.19645: Calling all_inventory to load vars for managed_node2 18823 1726855041.19647: Calling groups_inventory to load vars for managed_node2 18823 1726855041.19649: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855041.19653: Calling all_plugins_play to load vars for managed_node2 18823 1726855041.19655: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855041.19657: Calling groups_plugins_play to load vars for managed_node2 18823 1726855041.20840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855041.22460: done with get_vars() 18823 1726855041.22493: done queuing things up, now waiting for results queue to drain 18823 1726855041.22495: results queue empty 18823 1726855041.22496: checking for any_errors_fatal 18823 1726855041.22499: done checking for any_errors_fatal 18823 1726855041.22500: checking for max_fail_percentage 18823 
1726855041.22501: done checking for max_fail_percentage 18823 1726855041.22502: checking to see if all hosts have failed and the running result is not ok 18823 1726855041.22506: done checking to see if all hosts have failed 18823 1726855041.22510: getting the remaining hosts for this loop 18823 1726855041.22511: done getting the remaining hosts for this loop 18823 1726855041.22514: getting the next task for host managed_node2 18823 1726855041.22518: done getting next task for host managed_node2 18823 1726855041.22520: ^ task is: TASK: meta (flush_handlers) 18823 1726855041.22521: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855041.22524: getting variables 18823 1726855041.22526: in VariableManager get_vars() 18823 1726855041.22536: Calling all_inventory to load vars for managed_node2 18823 1726855041.22538: Calling groups_inventory to load vars for managed_node2 18823 1726855041.22541: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855041.22546: Calling all_plugins_play to load vars for managed_node2 18823 1726855041.22548: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855041.22551: Calling groups_plugins_play to load vars for managed_node2 18823 1726855041.23829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855041.25442: done with get_vars() 18823 1726855041.25466: done getting variables 18823 1726855041.25526: in VariableManager get_vars() 18823 1726855041.25536: Calling all_inventory to load vars for managed_node2 18823 1726855041.25538: Calling groups_inventory to load vars for managed_node2 18823 1726855041.25541: Calling all_plugins_inventory to load vars for managed_node2 18823 
1726855041.25545: Calling all_plugins_play to load vars for managed_node2 18823 1726855041.25553: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855041.25557: Calling groups_plugins_play to load vars for managed_node2 18823 1726855041.26729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855041.28399: done with get_vars() 18823 1726855041.28432: done queuing things up, now waiting for results queue to drain 18823 1726855041.28434: results queue empty 18823 1726855041.28435: checking for any_errors_fatal 18823 1726855041.28437: done checking for any_errors_fatal 18823 1726855041.28437: checking for max_fail_percentage 18823 1726855041.28439: done checking for max_fail_percentage 18823 1726855041.28439: checking to see if all hosts have failed and the running result is not ok 18823 1726855041.28440: done checking to see if all hosts have failed 18823 1726855041.28441: getting the remaining hosts for this loop 18823 1726855041.28442: done getting the remaining hosts for this loop 18823 1726855041.28445: getting the next task for host managed_node2 18823 1726855041.28448: done getting next task for host managed_node2 18823 1726855041.28449: ^ task is: None 18823 1726855041.28451: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855041.28452: done queuing things up, now waiting for results queue to drain 18823 1726855041.28453: results queue empty 18823 1726855041.28454: checking for any_errors_fatal 18823 1726855041.28454: done checking for any_errors_fatal 18823 1726855041.28455: checking for max_fail_percentage 18823 1726855041.28456: done checking for max_fail_percentage 18823 1726855041.28456: checking to see if all hosts have failed and the running result is not ok 18823 1726855041.28457: done checking to see if all hosts have failed 18823 1726855041.28458: getting the next task for host managed_node2 18823 1726855041.28461: done getting next task for host managed_node2 18823 1726855041.28461: ^ task is: None 18823 1726855041.28462: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855041.28516: in VariableManager get_vars() 18823 1726855041.28540: done with get_vars() 18823 1726855041.28547: in VariableManager get_vars() 18823 1726855041.28561: done with get_vars() 18823 1726855041.28566: variable 'omit' from source: magic vars 18823 1726855041.28694: variable 'profile' from source: play vars 18823 1726855041.28806: in VariableManager get_vars() 18823 1726855041.28826: done with get_vars() 18823 1726855041.28848: variable 'omit' from source: magic vars 18823 1726855041.28916: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18823 1726855041.29630: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855041.29654: getting the remaining hosts for this loop 18823 1726855041.29656: done getting the remaining hosts for this loop 18823 1726855041.29658: getting the next task for host managed_node2 18823 1726855041.29661: done getting next task for host managed_node2 18823 1726855041.29663: ^ task is: TASK: Gathering Facts 18823 1726855041.29665: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855041.29667: getting variables 18823 1726855041.29668: in VariableManager get_vars() 18823 1726855041.29679: Calling all_inventory to load vars for managed_node2 18823 1726855041.29681: Calling groups_inventory to load vars for managed_node2 18823 1726855041.29683: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855041.29691: Calling all_plugins_play to load vars for managed_node2 18823 1726855041.29694: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855041.29701: Calling groups_plugins_play to load vars for managed_node2 18823 1726855041.31052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855041.32635: done with get_vars() 18823 1726855041.32657: done getting variables 18823 1726855041.32713: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 13:57:21 -0400 (0:00:00.540) 0:00:32.979 ****** 18823 1726855041.32740: entering _queue_task() for managed_node2/gather_facts 18823 1726855041.33097: worker is 1 (out of 1 available) 18823 1726855041.33112: exiting _queue_task() for managed_node2/gather_facts 18823 1726855041.33126: done queuing things up, now waiting for results queue to drain 18823 1726855041.33127: waiting for pending results... 
18823 1726855041.33520: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855041.33792: in run() - task 0affcc66-ac2b-d391-077c-000000000417 18823 1726855041.33798: variable 'ansible_search_path' from source: unknown 18823 1726855041.33802: calling self._execute() 18823 1726855041.33807: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855041.33810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855041.33813: variable 'omit' from source: magic vars 18823 1726855041.34226: variable 'ansible_distribution_major_version' from source: facts 18823 1726855041.34244: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855041.34264: variable 'omit' from source: magic vars 18823 1726855041.34299: variable 'omit' from source: magic vars 18823 1726855041.34343: variable 'omit' from source: magic vars 18823 1726855041.34397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855041.34441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855041.34474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855041.34585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855041.34591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855041.34594: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855041.34596: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855041.34598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855041.34681: Set connection var ansible_timeout to 10 18823 1726855041.34702: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855041.34794: Set connection var ansible_shell_type to sh 18823 1726855041.34808: Set connection var ansible_shell_executable to /bin/sh 18823 1726855041.34812: Set connection var ansible_connection to ssh 18823 1726855041.34814: Set connection var ansible_pipelining to False 18823 1726855041.34816: variable 'ansible_shell_executable' from source: unknown 18823 1726855041.34818: variable 'ansible_connection' from source: unknown 18823 1726855041.34820: variable 'ansible_module_compression' from source: unknown 18823 1726855041.34822: variable 'ansible_shell_type' from source: unknown 18823 1726855041.34825: variable 'ansible_shell_executable' from source: unknown 18823 1726855041.34827: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855041.34829: variable 'ansible_pipelining' from source: unknown 18823 1726855041.34831: variable 'ansible_timeout' from source: unknown 18823 1726855041.34833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855041.35007: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855041.35032: variable 'omit' from source: magic vars 18823 1726855041.35044: starting attempt loop 18823 1726855041.35051: running the handler 18823 1726855041.35072: variable 'ansible_facts' from source: unknown 18823 1726855041.35101: _low_level_execute_command(): starting 18823 1726855041.35118: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855041.35840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855041.35855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 
1726855041.35867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855041.35907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855041.36017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855041.36032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855041.36050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855041.36156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855041.37880: stdout chunk (state=3): >>>/root <<< 18823 1726855041.38004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855041.38046: stderr chunk (state=3): >>><<< 18823 1726855041.38049: stdout chunk (state=3): >>><<< 18823 1726855041.38071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855041.38174: _low_level_execute_command(): starting 18823 1726855041.38177: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210 `" && echo ansible-tmp-1726855041.380785-20416-105082641342210="` echo /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210 `" ) && sleep 0' 18823 1726855041.38797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855041.38814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855041.38836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855041.38854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855041.38903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855041.38990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855041.39045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855041.39110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855041.41035: stdout chunk (state=3): >>>ansible-tmp-1726855041.380785-20416-105082641342210=/root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210 <<< 18823 1726855041.41178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855041.41194: stderr chunk (state=3): >>><<< 18823 1726855041.41203: stdout chunk (state=3): >>><<< 18823 1726855041.41232: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855041.380785-20416-105082641342210=/root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855041.41393: variable 'ansible_module_compression' from source: unknown 18823 1726855041.41396: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855041.41399: variable 'ansible_facts' from source: unknown 18823 1726855041.41608: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py 18823 1726855041.41810: Sending initial data 18823 1726855041.41820: Sent initial data (153 bytes) 18823 1726855041.42417: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855041.42434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855041.42450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855041.42509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855041.42569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855041.42590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855041.42622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855041.42724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855041.44320: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855041.44386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855041.44486: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp0zi2a8cs /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py <<< 18823 1726855041.44492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py" <<< 18823 1726855041.44547: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp0zi2a8cs" to remote "/root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py" <<< 18823 1726855041.46163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855041.46178: stderr chunk (state=3): >>><<< 18823 1726855041.46267: stdout chunk (state=3): >>><<< 18823 1726855041.46270: done transferring module to remote 18823 1726855041.46273: _low_level_execute_command(): starting 18823 1726855041.46275: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/ /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py && sleep 0' 18823 1726855041.46876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855041.46896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855041.46914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855041.46932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855041.46959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 
1726855041.47062: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855041.47083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855041.47197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855041.48998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855041.49002: stdout chunk (state=3): >>><<< 18823 1726855041.49004: stderr chunk (state=3): >>><<< 18823 1726855041.49023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855041.49033: _low_level_execute_command(): starting 18823 1726855041.49116: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/AnsiballZ_setup.py && sleep 0' 18823 1726855041.49654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855041.49702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855041.49778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855041.49805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 
1726855041.49842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855041.49927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855042.13001: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.59912109375, "5m": 0.4296875, "15m": 0.22021484375}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 
5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "21", "epoch": "1726855041", "epoch_int": "1726855041", "date": "2024-09-20", "time": "13:57:21", "iso8601_micro": "2024-09-20T17:57:21.770853Z", "iso8601": "2024-09-20T17:57:21Z", "iso8601_basic": "20240920T135721770853", "iso8601_basic_short": "20240920T135721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off 
[fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": 
"off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2942, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 589, "free": 2942}, "nocache": {"free": 3280, "used": 
251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 825, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 
261794537472, "block_size": 4096, "block_total": 65519099, "block_available": 63914682, "block_used": 1604417, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855042.14995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855042.14999: stdout chunk (state=3): >>><<< 18823 1726855042.15009: stderr chunk (state=3): >>><<< 18823 1726855042.15013: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 
21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.59912109375, "5m": 0.4296875, "15m": 0.22021484375}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "21", "epoch": "1726855041", "epoch_int": "1726855041", "date": "2024-09-20", "time": "13:57:21", "iso8601_micro": "2024-09-20T17:57:21.770853Z", "iso8601": "2024-09-20T17:57:21Z", "iso8601_basic": "20240920T135721770853", "iso8601_basic_short": "20240920T135721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, 
"ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2942, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 589, "free": 2942}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 825, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794537472, "block_size": 4096, "block_total": 65519099, "block_available": 63914682, "block_used": 1604417, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855042.15621: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855042.15650: _low_level_execute_command(): starting 18823 1726855042.15701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855041.380785-20416-105082641342210/ > /dev/null 2>&1 && sleep 0' 18823 1726855042.16632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855042.16651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855042.16667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855042.16705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855042.16720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855042.16757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855042.16823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855042.16848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855042.16867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855042.16971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855042.19194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855042.19199: stdout chunk (state=3): >>><<< 18823 1726855042.19201: stderr chunk (state=3): >>><<< 18823 1726855042.19207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855042.19210: handler run complete 18823 1726855042.19212: variable 'ansible_facts' from source: unknown 18823 1726855042.19467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.20039: variable 'ansible_facts' from source: unknown 18823 1726855042.20134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.20280: attempt loop complete, returning result 18823 1726855042.20292: _execute() done 18823 1726855042.20301: dumping result to json 18823 1726855042.20337: done dumping result, returning 18823 1726855042.20350: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-000000000417] 18823 1726855042.20362: sending task result for task 0affcc66-ac2b-d391-077c-000000000417 ok: [managed_node2] 18823 1726855042.21416: no more pending results, returning what we have 18823 1726855042.21466: results queue empty 18823 1726855042.21468: checking for any_errors_fatal 18823 1726855042.21469: done checking for any_errors_fatal 18823 1726855042.21470: checking for max_fail_percentage 18823 1726855042.21472: done checking for max_fail_percentage 18823 1726855042.21472: checking to see if all hosts have failed and the running result is not ok 18823 1726855042.21473: done checking to see if all hosts have failed 18823 1726855042.21474: getting the remaining hosts for this loop 18823 1726855042.21475: 
done getting the remaining hosts for this loop 18823 1726855042.21479: getting the next task for host managed_node2 18823 1726855042.21525: done getting next task for host managed_node2 18823 1726855042.21532: ^ task is: TASK: meta (flush_handlers) 18823 1726855042.21534: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855042.21590: getting variables 18823 1726855042.21592: in VariableManager get_vars() 18823 1726855042.21658: done sending task result for task 0affcc66-ac2b-d391-077c-000000000417 18823 1726855042.21661: WORKER PROCESS EXITING 18823 1726855042.21679: Calling all_inventory to load vars for managed_node2 18823 1726855042.21682: Calling groups_inventory to load vars for managed_node2 18823 1726855042.21684: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.21757: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.21761: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.21765: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.23737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.25495: done with get_vars() 18823 1726855042.25534: done getting variables 18823 1726855042.25642: in VariableManager get_vars() 18823 1726855042.25668: Calling all_inventory to load vars for managed_node2 18823 1726855042.25670: Calling groups_inventory to load vars for managed_node2 18823 1726855042.25673: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.25678: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.25680: Calling groups_plugins_inventory to load vars for managed_node2 18823 
1726855042.25683: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.26372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.27953: done with get_vars() 18823 1726855042.27979: done queuing things up, now waiting for results queue to drain 18823 1726855042.27982: results queue empty 18823 1726855042.27982: checking for any_errors_fatal 18823 1726855042.27986: done checking for any_errors_fatal 18823 1726855042.27997: checking for max_fail_percentage 18823 1726855042.27999: done checking for max_fail_percentage 18823 1726855042.27999: checking to see if all hosts have failed and the running result is not ok 18823 1726855042.28000: done checking to see if all hosts have failed 18823 1726855042.28001: getting the remaining hosts for this loop 18823 1726855042.28002: done getting the remaining hosts for this loop 18823 1726855042.28005: getting the next task for host managed_node2 18823 1726855042.28009: done getting next task for host managed_node2 18823 1726855042.28013: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18823 1726855042.28014: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855042.28024: getting variables 18823 1726855042.28025: in VariableManager get_vars() 18823 1726855042.28039: Calling all_inventory to load vars for managed_node2 18823 1726855042.28041: Calling groups_inventory to load vars for managed_node2 18823 1726855042.28043: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.28048: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.28050: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.28053: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.29250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.30891: done with get_vars() 18823 1726855042.30921: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:57:22 -0400 (0:00:00.982) 0:00:33.961 ****** 18823 1726855042.31000: entering _queue_task() for managed_node2/include_tasks 18823 1726855042.31371: worker is 1 (out of 1 available) 18823 1726855042.31385: exiting _queue_task() for managed_node2/include_tasks 18823 1726855042.31405: done queuing things up, now waiting for results queue to drain 18823 1726855042.31406: waiting for pending results... 
18823 1726855042.31638: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18823 1726855042.31763: in run() - task 0affcc66-ac2b-d391-077c-00000000005c 18823 1726855042.31795: variable 'ansible_search_path' from source: unknown 18823 1726855042.31814: variable 'ansible_search_path' from source: unknown 18823 1726855042.31993: calling self._execute() 18823 1726855042.31997: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.32000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855042.32003: variable 'omit' from source: magic vars 18823 1726855042.32371: variable 'ansible_distribution_major_version' from source: facts 18823 1726855042.32390: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855042.32400: _execute() done 18823 1726855042.32414: dumping result to json 18823 1726855042.32422: done dumping result, returning 18823 1726855042.32434: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-d391-077c-00000000005c] 18823 1726855042.32450: sending task result for task 0affcc66-ac2b-d391-077c-00000000005c 18823 1726855042.32698: done sending task result for task 0affcc66-ac2b-d391-077c-00000000005c 18823 1726855042.32701: WORKER PROCESS EXITING 18823 1726855042.32746: no more pending results, returning what we have 18823 1726855042.32751: in VariableManager get_vars() 18823 1726855042.32802: Calling all_inventory to load vars for managed_node2 18823 1726855042.32807: Calling groups_inventory to load vars for managed_node2 18823 1726855042.32811: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.32825: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.32829: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.32832: Calling 
groups_plugins_play to load vars for managed_node2 18823 1726855042.34543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.36243: done with get_vars() 18823 1726855042.36271: variable 'ansible_search_path' from source: unknown 18823 1726855042.36272: variable 'ansible_search_path' from source: unknown 18823 1726855042.36308: we have included files to process 18823 1726855042.36309: generating all_blocks data 18823 1726855042.36311: done generating all_blocks data 18823 1726855042.36312: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855042.36313: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855042.36315: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18823 1726855042.36912: done processing included file 18823 1726855042.36918: iterating over new_blocks loaded from include file 18823 1726855042.36920: in VariableManager get_vars() 18823 1726855042.36941: done with get_vars() 18823 1726855042.36943: filtering new block on tags 18823 1726855042.36959: done filtering new block on tags 18823 1726855042.36962: in VariableManager get_vars() 18823 1726855042.36982: done with get_vars() 18823 1726855042.36983: filtering new block on tags 18823 1726855042.37007: done filtering new block on tags 18823 1726855042.37010: in VariableManager get_vars() 18823 1726855042.37035: done with get_vars() 18823 1726855042.37037: filtering new block on tags 18823 1726855042.37052: done filtering new block on tags 18823 1726855042.37055: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 18823 1726855042.37059: extending task lists for 
all hosts with included blocks 18823 1726855042.37472: done extending task lists 18823 1726855042.37473: done processing included files 18823 1726855042.37474: results queue empty 18823 1726855042.37475: checking for any_errors_fatal 18823 1726855042.37476: done checking for any_errors_fatal 18823 1726855042.37477: checking for max_fail_percentage 18823 1726855042.37478: done checking for max_fail_percentage 18823 1726855042.37478: checking to see if all hosts have failed and the running result is not ok 18823 1726855042.37479: done checking to see if all hosts have failed 18823 1726855042.37480: getting the remaining hosts for this loop 18823 1726855042.37481: done getting the remaining hosts for this loop 18823 1726855042.37484: getting the next task for host managed_node2 18823 1726855042.37489: done getting next task for host managed_node2 18823 1726855042.37492: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18823 1726855042.37495: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855042.37505: getting variables 18823 1726855042.37507: in VariableManager get_vars() 18823 1726855042.37521: Calling all_inventory to load vars for managed_node2 18823 1726855042.37524: Calling groups_inventory to load vars for managed_node2 18823 1726855042.37526: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.37531: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.37533: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.37536: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.38799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.40462: done with get_vars() 18823 1726855042.40486: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:57:22 -0400 (0:00:00.095) 0:00:34.057 ****** 18823 1726855042.40572: entering _queue_task() for managed_node2/setup 18823 1726855042.41121: worker is 1 (out of 1 available) 18823 1726855042.41130: exiting _queue_task() for managed_node2/setup 18823 1726855042.41141: done queuing things up, now waiting for results queue to drain 18823 1726855042.41142: waiting for pending results... 
18823 1726855042.41371: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18823 1726855042.41422: in run() - task 0affcc66-ac2b-d391-077c-000000000458 18823 1726855042.41441: variable 'ansible_search_path' from source: unknown 18823 1726855042.41449: variable 'ansible_search_path' from source: unknown 18823 1726855042.41499: calling self._execute() 18823 1726855042.41599: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.41614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855042.41630: variable 'omit' from source: magic vars 18823 1726855042.42024: variable 'ansible_distribution_major_version' from source: facts 18823 1726855042.42042: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855042.42348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855042.44749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855042.44830: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855042.44879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855042.44958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855042.44961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855042.45041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855042.45082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855042.45117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855042.45156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855042.45180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855042.45284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855042.45289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855042.45297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855042.45342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855042.45360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855042.45534: variable '__network_required_facts' from source: role 
'' defaults 18823 1726855042.45550: variable 'ansible_facts' from source: unknown 18823 1726855042.46343: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18823 1726855042.46373: when evaluation is False, skipping this task 18823 1726855042.46377: _execute() done 18823 1726855042.46379: dumping result to json 18823 1726855042.46381: done dumping result, returning 18823 1726855042.46391: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-d391-077c-000000000458] 18823 1726855042.46479: sending task result for task 0affcc66-ac2b-d391-077c-000000000458 18823 1726855042.46554: done sending task result for task 0affcc66-ac2b-d391-077c-000000000458 18823 1726855042.46557: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855042.46608: no more pending results, returning what we have 18823 1726855042.46613: results queue empty 18823 1726855042.46614: checking for any_errors_fatal 18823 1726855042.46616: done checking for any_errors_fatal 18823 1726855042.46617: checking for max_fail_percentage 18823 1726855042.46618: done checking for max_fail_percentage 18823 1726855042.46619: checking to see if all hosts have failed and the running result is not ok 18823 1726855042.46620: done checking to see if all hosts have failed 18823 1726855042.46620: getting the remaining hosts for this loop 18823 1726855042.46622: done getting the remaining hosts for this loop 18823 1726855042.46626: getting the next task for host managed_node2 18823 1726855042.46636: done getting next task for host managed_node2 18823 1726855042.46640: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18823 1726855042.46643: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855042.46657: getting variables 18823 1726855042.46659: in VariableManager get_vars() 18823 1726855042.46702: Calling all_inventory to load vars for managed_node2 18823 1726855042.46708: Calling groups_inventory to load vars for managed_node2 18823 1726855042.46711: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.46723: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.46727: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.46730: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.48475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.50117: done with get_vars() 18823 1726855042.50144: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:57:22 -0400 (0:00:00.096) 0:00:34.154 ****** 18823 1726855042.50247: entering _queue_task() for managed_node2/stat 18823 1726855042.50611: worker is 1 (out of 1 available) 18823 1726855042.50624: exiting _queue_task() for managed_node2/stat 18823 1726855042.50637: done queuing things up, now waiting for results queue to drain 18823 1726855042.50638: waiting for pending results... 
18823 1726855042.51017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 18823 1726855042.51069: in run() - task 0affcc66-ac2b-d391-077c-00000000045a 18823 1726855042.51113: variable 'ansible_search_path' from source: unknown 18823 1726855042.51117: variable 'ansible_search_path' from source: unknown 18823 1726855042.51147: calling self._execute() 18823 1726855042.51293: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.51297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855042.51300: variable 'omit' from source: magic vars 18823 1726855042.51661: variable 'ansible_distribution_major_version' from source: facts 18823 1726855042.51679: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855042.51853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855042.52144: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855042.52195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855042.52237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855042.52302: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855042.52369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855042.52402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855042.52494: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855042.52498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855042.52562: variable '__network_is_ostree' from source: set_fact 18823 1726855042.52575: Evaluated conditional (not __network_is_ostree is defined): False 18823 1726855042.52583: when evaluation is False, skipping this task 18823 1726855042.52594: _execute() done 18823 1726855042.52605: dumping result to json 18823 1726855042.52693: done dumping result, returning 18823 1726855042.52697: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-d391-077c-00000000045a] 18823 1726855042.52700: sending task result for task 0affcc66-ac2b-d391-077c-00000000045a 18823 1726855042.52767: done sending task result for task 0affcc66-ac2b-d391-077c-00000000045a 18823 1726855042.52770: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18823 1726855042.52828: no more pending results, returning what we have 18823 1726855042.52831: results queue empty 18823 1726855042.52833: checking for any_errors_fatal 18823 1726855042.52839: done checking for any_errors_fatal 18823 1726855042.52840: checking for max_fail_percentage 18823 1726855042.52842: done checking for max_fail_percentage 18823 1726855042.52842: checking to see if all hosts have failed and the running result is not ok 18823 1726855042.52843: done checking to see if all hosts have failed 18823 1726855042.52844: getting the remaining hosts for this loop 18823 1726855042.52846: done getting the remaining hosts for this loop 18823 
1726855042.52850: getting the next task for host managed_node2 18823 1726855042.52856: done getting next task for host managed_node2 18823 1726855042.52860: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18823 1726855042.52863: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855042.52877: getting variables 18823 1726855042.52878: in VariableManager get_vars() 18823 1726855042.52921: Calling all_inventory to load vars for managed_node2 18823 1726855042.52925: Calling groups_inventory to load vars for managed_node2 18823 1726855042.52927: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.52939: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.52943: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.52947: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.54554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.56301: done with get_vars() 18823 1726855042.56325: done getting variables 18823 1726855042.56399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:57:22 -0400 (0:00:00.061) 0:00:34.216 ****** 18823 1726855042.56436: entering _queue_task() for managed_node2/set_fact 18823 1726855042.56818: worker is 1 (out of 1 available) 18823 1726855042.56832: exiting _queue_task() for managed_node2/set_fact 18823 1726855042.56844: done queuing things up, now waiting for results queue to drain 18823 1726855042.56845: waiting for pending results... 18823 1726855042.57218: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18823 1726855042.57256: in run() - task 0affcc66-ac2b-d391-077c-00000000045b 18823 1726855042.57295: variable 'ansible_search_path' from source: unknown 18823 1726855042.57311: variable 'ansible_search_path' from source: unknown 18823 1726855042.57422: calling self._execute() 18823 1726855042.57459: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.57469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855042.57508: variable 'omit' from source: magic vars 18823 1726855042.57777: variable 'ansible_distribution_major_version' from source: facts 18823 1726855042.57789: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855042.57914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855042.58105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855042.58141: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855042.58166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 
1726855042.58193: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855042.58260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855042.58276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855042.58298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855042.58318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855042.58380: variable '__network_is_ostree' from source: set_fact 18823 1726855042.58386: Evaluated conditional (not __network_is_ostree is defined): False 18823 1726855042.58391: when evaluation is False, skipping this task 18823 1726855042.58394: _execute() done 18823 1726855042.58396: dumping result to json 18823 1726855042.58401: done dumping result, returning 18823 1726855042.58411: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-d391-077c-00000000045b] 18823 1726855042.58416: sending task result for task 0affcc66-ac2b-d391-077c-00000000045b 18823 1726855042.58497: done sending task result for task 0affcc66-ac2b-d391-077c-00000000045b 18823 1726855042.58501: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18823 1726855042.58551: no more pending results, returning what we 
have 18823 1726855042.58554: results queue empty 18823 1726855042.58555: checking for any_errors_fatal 18823 1726855042.58562: done checking for any_errors_fatal 18823 1726855042.58562: checking for max_fail_percentage 18823 1726855042.58564: done checking for max_fail_percentage 18823 1726855042.58564: checking to see if all hosts have failed and the running result is not ok 18823 1726855042.58565: done checking to see if all hosts have failed 18823 1726855042.58566: getting the remaining hosts for this loop 18823 1726855042.58567: done getting the remaining hosts for this loop 18823 1726855042.58571: getting the next task for host managed_node2 18823 1726855042.58580: done getting next task for host managed_node2 18823 1726855042.58583: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18823 1726855042.58585: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855042.58600: getting variables 18823 1726855042.58602: in VariableManager get_vars() 18823 1726855042.58639: Calling all_inventory to load vars for managed_node2 18823 1726855042.58644: Calling groups_inventory to load vars for managed_node2 18823 1726855042.58646: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855042.58655: Calling all_plugins_play to load vars for managed_node2 18823 1726855042.58657: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855042.58660: Calling groups_plugins_play to load vars for managed_node2 18823 1726855042.59676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855042.60980: done with get_vars() 18823 1726855042.61004: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:57:22 -0400 (0:00:00.046) 0:00:34.262 ****** 18823 1726855042.61078: entering _queue_task() for managed_node2/service_facts 18823 1726855042.61330: worker is 1 (out of 1 available) 18823 1726855042.61345: exiting _queue_task() for managed_node2/service_facts 18823 1726855042.61358: done queuing things up, now waiting for results queue to drain 18823 1726855042.61359: waiting for pending results... 
18823 1726855042.61538: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 18823 1726855042.61634: in run() - task 0affcc66-ac2b-d391-077c-00000000045d 18823 1726855042.61645: variable 'ansible_search_path' from source: unknown 18823 1726855042.61648: variable 'ansible_search_path' from source: unknown 18823 1726855042.61676: calling self._execute() 18823 1726855042.61749: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.61753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855042.61762: variable 'omit' from source: magic vars 18823 1726855042.62034: variable 'ansible_distribution_major_version' from source: facts 18823 1726855042.62043: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855042.62049: variable 'omit' from source: magic vars 18823 1726855042.62081: variable 'omit' from source: magic vars 18823 1726855042.62108: variable 'omit' from source: magic vars 18823 1726855042.62143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855042.62170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855042.62186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855042.62202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855042.62214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855042.62241: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855042.62244: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.62246: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 18823 1726855042.62317: Set connection var ansible_timeout to 10 18823 1726855042.62322: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855042.62325: Set connection var ansible_shell_type to sh 18823 1726855042.62330: Set connection var ansible_shell_executable to /bin/sh 18823 1726855042.62335: Set connection var ansible_connection to ssh 18823 1726855042.62341: Set connection var ansible_pipelining to False 18823 1726855042.62362: variable 'ansible_shell_executable' from source: unknown 18823 1726855042.62365: variable 'ansible_connection' from source: unknown 18823 1726855042.62368: variable 'ansible_module_compression' from source: unknown 18823 1726855042.62370: variable 'ansible_shell_type' from source: unknown 18823 1726855042.62372: variable 'ansible_shell_executable' from source: unknown 18823 1726855042.62374: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855042.62376: variable 'ansible_pipelining' from source: unknown 18823 1726855042.62379: variable 'ansible_timeout' from source: unknown 18823 1726855042.62383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855042.62531: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855042.62539: variable 'omit' from source: magic vars 18823 1726855042.62544: starting attempt loop 18823 1726855042.62547: running the handler 18823 1726855042.62560: _low_level_execute_command(): starting 18823 1726855042.62571: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855042.63094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18823 1726855042.63098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855042.63101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855042.63103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855042.63158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855042.63162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855042.63247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855042.64957: stdout chunk (state=3): >>>/root <<< 18823 1726855042.65114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855042.65118: stdout chunk (state=3): >>><<< 18823 1726855042.65120: stderr chunk (state=3): >>><<< 18823 1726855042.65142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855042.65243: _low_level_execute_command(): starting 18823 1726855042.65247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749 `" && echo ansible-tmp-1726855042.651502-20482-100131242786749="` echo /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749 `" ) && sleep 0' 18823 1726855042.65741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855042.65793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855042.65806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855042.65884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855042.67799: stdout chunk (state=3): >>>ansible-tmp-1726855042.651502-20482-100131242786749=/root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749 <<< 18823 1726855042.67949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855042.67964: stderr chunk (state=3): >>><<< 18823 1726855042.67980: stdout chunk (state=3): >>><<< 18823 1726855042.67998: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855042.651502-20482-100131242786749=/root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855042.68200: variable 'ansible_module_compression' from source: unknown 18823 1726855042.68205: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18823 1726855042.68208: variable 'ansible_facts' from source: unknown 18823 1726855042.68269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py 18823 1726855042.68381: Sending initial data 18823 1726855042.68384: Sent initial data (161 bytes) 18823 1726855042.68930: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855042.68947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855042.68962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855042.69068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855042.69085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855042.69199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855042.70754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855042.70821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855042.70886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpxukcrslf /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py <<< 18823 1726855042.70896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py" <<< 18823 1726855042.70955: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpxukcrslf" to remote "/root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py" <<< 18823 1726855042.70963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py" <<< 18823 1726855042.71685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855042.71903: stderr chunk (state=3): >>><<< 18823 1726855042.71907: stdout chunk (state=3): >>><<< 18823 1726855042.71915: done transferring module to remote 18823 1726855042.71917: _low_level_execute_command(): starting 18823 1726855042.71919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/ /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py && sleep 0' 18823 1726855042.72507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855042.72521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855042.72533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855042.72578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855042.72593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855042.72676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855042.74569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855042.74577: stdout chunk (state=3): >>><<< 18823 1726855042.74580: stderr chunk (state=3): >>><<< 18823 1726855042.74583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855042.74585: _low_level_execute_command(): starting 18823 1726855042.74589: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/AnsiballZ_service_facts.py && sleep 0' 18823 1726855042.75740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855042.75744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855042.75824: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18823 1726855044.28346: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 18823 1726855044.28356: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 18823 1726855044.28480: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18823 1726855044.29808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855044.29850: stdout chunk (state=3): >>><<< 18823 1726855044.29853: stderr chunk (state=3): >>><<< 18823 1726855044.30094: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855044.31693: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855044.31697: _low_level_execute_command(): starting 18823 1726855044.31699: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855042.651502-20482-100131242786749/ > /dev/null 2>&1 && sleep 0' 18823 1726855044.33170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855044.33186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855044.33380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855044.33383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855044.33386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855044.33393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855044.33396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855044.33398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855044.33400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855044.33626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855044.33711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855044.35646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855044.35683: stderr chunk (state=3): >>><<< 18823 1726855044.35903: stdout chunk (state=3): >>><<< 18823 1726855044.35911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855044.35913: handler run complete 18823 1726855044.36323: variable 'ansible_facts' from source: unknown 18823 1726855044.36737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855044.39205: variable 'ansible_facts' from source: unknown 18823 1726855044.39560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855044.40169: attempt loop complete, returning result 18823 1726855044.40172: _execute() done 18823 1726855044.40174: dumping result to json 18823 1726855044.40386: done dumping result, returning 18823 1726855044.40396: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-d391-077c-00000000045d] 18823 1726855044.40401: sending task result for task 0affcc66-ac2b-d391-077c-00000000045d 18823 1726855044.42850: done sending task result for task 0affcc66-ac2b-d391-077c-00000000045d 18823 1726855044.42853: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855044.42957: no more pending results, returning what we have 18823 1726855044.42960: results queue empty 18823 1726855044.42961: checking for any_errors_fatal 18823 1726855044.42965: done checking for any_errors_fatal 18823 1726855044.42965: checking for max_fail_percentage 18823 1726855044.42967: done checking for max_fail_percentage 18823 1726855044.42968: checking to see if all hosts have failed and the running result is not ok 18823 1726855044.42968: done checking to see if all hosts have failed 18823 1726855044.42969: getting the remaining 
hosts for this loop 18823 1726855044.42971: done getting the remaining hosts for this loop 18823 1726855044.42974: getting the next task for host managed_node2 18823 1726855044.42980: done getting next task for host managed_node2 18823 1726855044.42983: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18823 1726855044.42986: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855044.42999: getting variables 18823 1726855044.43000: in VariableManager get_vars() 18823 1726855044.43031: Calling all_inventory to load vars for managed_node2 18823 1726855044.43033: Calling groups_inventory to load vars for managed_node2 18823 1726855044.43036: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855044.43044: Calling all_plugins_play to load vars for managed_node2 18823 1726855044.43047: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855044.43050: Calling groups_plugins_play to load vars for managed_node2 18823 1726855044.45915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855044.49494: done with get_vars() 18823 1726855044.49531: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:57:24 -0400 
(0:00:01.886) 0:00:36.149 ****** 18823 1726855044.49755: entering _queue_task() for managed_node2/package_facts 18823 1726855044.50668: worker is 1 (out of 1 available) 18823 1726855044.50681: exiting _queue_task() for managed_node2/package_facts 18823 1726855044.50696: done queuing things up, now waiting for results queue to drain 18823 1726855044.50697: waiting for pending results... 18823 1726855044.51276: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 18823 1726855044.51834: in run() - task 0affcc66-ac2b-d391-077c-00000000045e 18823 1726855044.51847: variable 'ansible_search_path' from source: unknown 18823 1726855044.51973: variable 'ansible_search_path' from source: unknown 18823 1726855044.51978: calling self._execute() 18823 1726855044.52185: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855044.52394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855044.52398: variable 'omit' from source: magic vars 18823 1726855044.53002: variable 'ansible_distribution_major_version' from source: facts 18823 1726855044.53022: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855044.53109: variable 'omit' from source: magic vars 18823 1726855044.53168: variable 'omit' from source: magic vars 18823 1726855044.53253: variable 'omit' from source: magic vars 18823 1726855044.53366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855044.53470: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855044.53500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855044.53560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855044.53898: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855044.53901: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855044.53903: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855044.53905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855044.53965: Set connection var ansible_timeout to 10 18823 1726855044.53977: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855044.53985: Set connection var ansible_shell_type to sh 18823 1726855044.54005: Set connection var ansible_shell_executable to /bin/sh 18823 1726855044.54017: Set connection var ansible_connection to ssh 18823 1726855044.54027: Set connection var ansible_pipelining to False 18823 1726855044.54061: variable 'ansible_shell_executable' from source: unknown 18823 1726855044.54071: variable 'ansible_connection' from source: unknown 18823 1726855044.54120: variable 'ansible_module_compression' from source: unknown 18823 1726855044.54129: variable 'ansible_shell_type' from source: unknown 18823 1726855044.54137: variable 'ansible_shell_executable' from source: unknown 18823 1726855044.54143: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855044.54150: variable 'ansible_pipelining' from source: unknown 18823 1726855044.54155: variable 'ansible_timeout' from source: unknown 18823 1726855044.54162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855044.54993: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855044.54998: variable 'omit' from source: magic vars 18823 1726855044.55001: starting attempt loop 18823 
1726855044.55003: running the handler 18823 1726855044.55008: _low_level_execute_command(): starting 18823 1726855044.55010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855044.56473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855044.56477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855044.56499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855044.56874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855044.57001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855044.58686: stdout chunk (state=3): >>>/root <<< 18823 1726855044.58841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855044.58845: stdout chunk (state=3): >>><<< 18823 1726855044.58847: stderr chunk (state=3): >>><<< 18823 1726855044.58864: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855044.58884: _low_level_execute_command(): starting 18823 1726855044.58899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946 `" && echo ansible-tmp-1726855044.5887086-20542-203357098078946="` echo /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946 `" ) && sleep 0' 18823 1726855044.60089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855044.60188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855044.60404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855044.60584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855044.62512: stdout chunk (state=3): >>>ansible-tmp-1726855044.5887086-20542-203357098078946=/root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946 <<< 18823 1726855044.62672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855044.62702: stdout chunk (state=3): >>><<< 18823 1726855044.62715: stderr chunk (state=3): >>><<< 18823 1726855044.62736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855044.5887086-20542-203357098078946=/root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855044.62791: variable 'ansible_module_compression' from source: unknown 18823 1726855044.63093: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18823 1726855044.63493: variable 'ansible_facts' from source: unknown 18823 1726855044.63495: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py 18823 1726855044.64214: Sending initial data 18823 1726855044.64217: Sent initial data (162 bytes) 18823 1726855044.65670: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855044.65684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855044.65698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855044.65748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855044.65910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855044.65913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855044.66047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855044.67654: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18823 1726855044.67669: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855044.67864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855044.67937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmplsjn1nbr /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py <<< 18823 1726855044.68104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmplsjn1nbr" to remote "/root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py" <<< 18823 1726855044.71586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855044.71737: stderr chunk (state=3): >>><<< 18823 1726855044.71749: stdout chunk (state=3): >>><<< 18823 1726855044.71773: done transferring module to remote 18823 1726855044.71791: _low_level_execute_command(): starting 18823 1726855044.71977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/ /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py && sleep 0' 18823 1726855044.73012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855044.73177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855044.73310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855044.73342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855044.73461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855044.75651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855044.75661: stdout chunk (state=3): >>><<< 18823 1726855044.75671: stderr chunk (state=3): >>><<< 18823 1726855044.75690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855044.75707: _low_level_execute_command(): starting 18823 1726855044.75968: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/AnsiballZ_package_facts.py && sleep 0' 18823 1726855044.77199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855044.77397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855044.77442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855044.77459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855044.77563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 
1726855045.22507: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": 
[{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 18823 1726855045.22715: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": 
[{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", 
"version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": 
"libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 18823 1726855045.22853: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": 
"c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": 
"1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", 
"version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", 
"release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", 
"version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": 
"perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18823 1726855045.24615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855045.24627: stdout chunk (state=3): >>><<< 18823 1726855045.24646: stderr chunk (state=3): >>><<< 18823 1726855045.24684: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": 
"1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": 
"python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855045.35994: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855045.36048: _low_level_execute_command(): starting 18823 1726855045.36093: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855044.5887086-20542-203357098078946/ > /dev/null 2>&1 && sleep 0' 18823 1726855045.36879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855045.36947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
18823 1726855045.36950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855045.37047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855045.37065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855045.37082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855045.37115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855045.37224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855045.39220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855045.39262: stderr chunk (state=3): >>><<< 18823 1726855045.39345: stdout chunk (state=3): >>><<< 18823 1726855045.39501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855045.39506: handler run complete 18823 1726855045.40953: variable 'ansible_facts' from source: unknown 18823 1726855045.41850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855045.44289: variable 'ansible_facts' from source: unknown 18823 1726855045.44626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855045.45049: attempt loop complete, returning result 18823 1726855045.45059: _execute() done 18823 1726855045.45062: dumping result to json 18823 1726855045.45218: done dumping result, returning 18823 1726855045.45225: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-d391-077c-00000000045e] 18823 1726855045.45228: sending task result for task 0affcc66-ac2b-d391-077c-00000000045e 18823 1726855045.53071: done sending task result for task 0affcc66-ac2b-d391-077c-00000000045e 18823 1726855045.53074: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: 
true' was specified for this result", "changed": false } 18823 1726855045.53162: no more pending results, returning what we have 18823 1726855045.53165: results queue empty 18823 1726855045.53166: checking for any_errors_fatal 18823 1726855045.53169: done checking for any_errors_fatal 18823 1726855045.53170: checking for max_fail_percentage 18823 1726855045.53171: done checking for max_fail_percentage 18823 1726855045.53172: checking to see if all hosts have failed and the running result is not ok 18823 1726855045.53172: done checking to see if all hosts have failed 18823 1726855045.53173: getting the remaining hosts for this loop 18823 1726855045.53174: done getting the remaining hosts for this loop 18823 1726855045.53177: getting the next task for host managed_node2 18823 1726855045.53185: done getting next task for host managed_node2 18823 1726855045.53190: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18823 1726855045.53192: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855045.53200: getting variables 18823 1726855045.53201: in VariableManager get_vars() 18823 1726855045.53225: Calling all_inventory to load vars for managed_node2 18823 1726855045.53227: Calling groups_inventory to load vars for managed_node2 18823 1726855045.53229: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855045.53235: Calling all_plugins_play to load vars for managed_node2 18823 1726855045.53237: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855045.53240: Calling groups_plugins_play to load vars for managed_node2 18823 1726855045.54719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855045.57663: done with get_vars() 18823 1726855045.57702: done getting variables 18823 1726855045.57771: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:57:25 -0400 (0:00:01.082) 0:00:37.231 ****** 18823 1726855045.58017: entering _queue_task() for managed_node2/debug 18823 1726855045.58926: worker is 1 (out of 1 available) 18823 1726855045.58938: exiting _queue_task() for managed_node2/debug 18823 1726855045.58948: done queuing things up, now waiting for results queue to drain 18823 1726855045.58949: waiting for pending results... 
18823 1726855045.59319: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 18823 1726855045.59608: in run() - task 0affcc66-ac2b-d391-077c-00000000005d 18823 1726855045.59613: variable 'ansible_search_path' from source: unknown 18823 1726855045.59618: variable 'ansible_search_path' from source: unknown 18823 1726855045.59822: calling self._execute() 18823 1726855045.60149: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855045.60153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855045.60157: variable 'omit' from source: magic vars 18823 1726855045.61148: variable 'ansible_distribution_major_version' from source: facts 18823 1726855045.61216: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855045.61296: variable 'omit' from source: magic vars 18823 1726855045.61413: variable 'omit' from source: magic vars 18823 1726855045.61843: variable 'network_provider' from source: set_fact 18823 1726855045.61892: variable 'omit' from source: magic vars 18823 1726855045.62109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855045.62118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855045.62190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855045.62379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855045.62382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855045.62385: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855045.62389: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 
1726855045.62391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855045.62758: Set connection var ansible_timeout to 10 18823 1726855045.62897: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855045.62901: Set connection var ansible_shell_type to sh 18823 1726855045.62903: Set connection var ansible_shell_executable to /bin/sh 18823 1726855045.62906: Set connection var ansible_connection to ssh 18823 1726855045.62908: Set connection var ansible_pipelining to False 18823 1726855045.62911: variable 'ansible_shell_executable' from source: unknown 18823 1726855045.62917: variable 'ansible_connection' from source: unknown 18823 1726855045.62921: variable 'ansible_module_compression' from source: unknown 18823 1726855045.62990: variable 'ansible_shell_type' from source: unknown 18823 1726855045.63010: variable 'ansible_shell_executable' from source: unknown 18823 1726855045.63078: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855045.63081: variable 'ansible_pipelining' from source: unknown 18823 1726855045.63084: variable 'ansible_timeout' from source: unknown 18823 1726855045.63086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855045.63476: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855045.63601: variable 'omit' from source: magic vars 18823 1726855045.63605: starting attempt loop 18823 1726855045.63608: running the handler 18823 1726855045.63802: handler run complete 18823 1726855045.63805: attempt loop complete, returning result 18823 1726855045.63807: _execute() done 18823 1726855045.63810: dumping result to json 18823 1726855045.63812: done dumping result, returning 
18823 1726855045.63813: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-d391-077c-00000000005d] 18823 1726855045.63816: sending task result for task 0affcc66-ac2b-d391-077c-00000000005d 18823 1726855045.64093: done sending task result for task 0affcc66-ac2b-d391-077c-00000000005d 18823 1726855045.64097: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 18823 1726855045.64163: no more pending results, returning what we have 18823 1726855045.64167: results queue empty 18823 1726855045.64168: checking for any_errors_fatal 18823 1726855045.64180: done checking for any_errors_fatal 18823 1726855045.64181: checking for max_fail_percentage 18823 1726855045.64183: done checking for max_fail_percentage 18823 1726855045.64184: checking to see if all hosts have failed and the running result is not ok 18823 1726855045.64184: done checking to see if all hosts have failed 18823 1726855045.64185: getting the remaining hosts for this loop 18823 1726855045.64188: done getting the remaining hosts for this loop 18823 1726855045.64193: getting the next task for host managed_node2 18823 1726855045.64200: done getting next task for host managed_node2 18823 1726855045.64204: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18823 1726855045.64206: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855045.64216: getting variables 18823 1726855045.64218: in VariableManager get_vars() 18823 1726855045.64256: Calling all_inventory to load vars for managed_node2 18823 1726855045.64259: Calling groups_inventory to load vars for managed_node2 18823 1726855045.64262: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855045.64273: Calling all_plugins_play to load vars for managed_node2 18823 1726855045.64279: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855045.64689: Calling groups_plugins_play to load vars for managed_node2 18823 1726855045.68317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855045.71895: done with get_vars() 18823 1726855045.71929: done getting variables 18823 1726855045.72002: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:57:25 -0400 (0:00:00.140) 0:00:37.372 ****** 18823 1726855045.72033: entering _queue_task() for managed_node2/fail 18823 1726855045.72452: worker is 1 (out of 1 available) 18823 1726855045.72465: exiting _queue_task() for managed_node2/fail 18823 1726855045.72476: done queuing things up, now waiting for results queue to drain 18823 1726855045.72477: waiting for pending results... 
18823 1726855045.72858: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18823 1726855045.72934: in run() - task 0affcc66-ac2b-d391-077c-00000000005e 18823 1726855045.73069: variable 'ansible_search_path' from source: unknown 18823 1726855045.73076: variable 'ansible_search_path' from source: unknown 18823 1726855045.73080: calling self._execute() 18823 1726855045.73141: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855045.73152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855045.73165: variable 'omit' from source: magic vars 18823 1726855045.73598: variable 'ansible_distribution_major_version' from source: facts 18823 1726855045.73629: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855045.73765: variable 'network_state' from source: role '' defaults 18823 1726855045.73781: Evaluated conditional (network_state != {}): False 18823 1726855045.73792: when evaluation is False, skipping this task 18823 1726855045.73799: _execute() done 18823 1726855045.73806: dumping result to json 18823 1726855045.73813: done dumping result, returning 18823 1726855045.73835: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-d391-077c-00000000005e] 18823 1726855045.73946: sending task result for task 0affcc66-ac2b-d391-077c-00000000005e 18823 1726855045.74015: done sending task result for task 0affcc66-ac2b-d391-077c-00000000005e 18823 1726855045.74017: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855045.74071: no more pending results, 
returning what we have 18823 1726855045.74075: results queue empty 18823 1726855045.74076: checking for any_errors_fatal 18823 1726855045.74083: done checking for any_errors_fatal 18823 1726855045.74084: checking for max_fail_percentage 18823 1726855045.74086: done checking for max_fail_percentage 18823 1726855045.74086: checking to see if all hosts have failed and the running result is not ok 18823 1726855045.74089: done checking to see if all hosts have failed 18823 1726855045.74090: getting the remaining hosts for this loop 18823 1726855045.74091: done getting the remaining hosts for this loop 18823 1726855045.74095: getting the next task for host managed_node2 18823 1726855045.74103: done getting next task for host managed_node2 18823 1726855045.74106: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18823 1726855045.74108: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855045.74123: getting variables 18823 1726855045.74125: in VariableManager get_vars() 18823 1726855045.74283: Calling all_inventory to load vars for managed_node2 18823 1726855045.74286: Calling groups_inventory to load vars for managed_node2 18823 1726855045.74291: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855045.74304: Calling all_plugins_play to load vars for managed_node2 18823 1726855045.74308: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855045.74311: Calling groups_plugins_play to load vars for managed_node2 18823 1726855045.76012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855045.77604: done with get_vars() 18823 1726855045.77632: done getting variables 18823 1726855045.77693: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:57:25 -0400 (0:00:00.056) 0:00:37.428 ****** 18823 1726855045.77724: entering _queue_task() for managed_node2/fail 18823 1726855045.78479: worker is 1 (out of 1 available) 18823 1726855045.78495: exiting _queue_task() for managed_node2/fail 18823 1726855045.78508: done queuing things up, now waiting for results queue to drain 18823 1726855045.78509: waiting for pending results... 
18823 1726855045.78981: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18823 1726855045.79076: in run() - task 0affcc66-ac2b-d391-077c-00000000005f 18823 1726855045.79083: variable 'ansible_search_path' from source: unknown 18823 1726855045.79089: variable 'ansible_search_path' from source: unknown 18823 1726855045.79331: calling self._execute() 18823 1726855045.79434: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855045.79493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855045.79497: variable 'omit' from source: magic vars 18823 1726855045.80251: variable 'ansible_distribution_major_version' from source: facts 18823 1726855045.80261: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855045.80668: variable 'network_state' from source: role '' defaults 18823 1726855045.80681: Evaluated conditional (network_state != {}): False 18823 1726855045.80685: when evaluation is False, skipping this task 18823 1726855045.80691: _execute() done 18823 1726855045.80694: dumping result to json 18823 1726855045.80696: done dumping result, returning 18823 1726855045.80902: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-d391-077c-00000000005f] 18823 1726855045.80906: sending task result for task 0affcc66-ac2b-d391-077c-00000000005f 18823 1726855045.80983: done sending task result for task 0affcc66-ac2b-d391-077c-00000000005f 18823 1726855045.80989: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855045.81044: no more pending results, returning what we have 18823 
1726855045.81049: results queue empty 18823 1726855045.81050: checking for any_errors_fatal 18823 1726855045.81059: done checking for any_errors_fatal 18823 1726855045.81060: checking for max_fail_percentage 18823 1726855045.81062: done checking for max_fail_percentage 18823 1726855045.81062: checking to see if all hosts have failed and the running result is not ok 18823 1726855045.81063: done checking to see if all hosts have failed 18823 1726855045.81064: getting the remaining hosts for this loop 18823 1726855045.81066: done getting the remaining hosts for this loop 18823 1726855045.81070: getting the next task for host managed_node2 18823 1726855045.81078: done getting next task for host managed_node2 18823 1726855045.81082: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18823 1726855045.81085: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855045.81104: getting variables 18823 1726855045.81106: in VariableManager get_vars() 18823 1726855045.81151: Calling all_inventory to load vars for managed_node2 18823 1726855045.81154: Calling groups_inventory to load vars for managed_node2 18823 1726855045.81157: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855045.81172: Calling all_plugins_play to load vars for managed_node2 18823 1726855045.81175: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855045.81179: Calling groups_plugins_play to load vars for managed_node2 18823 1726855045.85478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855045.89416: done with get_vars() 18823 1726855045.89446: done getting variables 18823 1726855045.89723: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:57:25 -0400 (0:00:00.120) 0:00:37.549 ****** 18823 1726855045.89757: entering _queue_task() for managed_node2/fail 18823 1726855045.90516: worker is 1 (out of 1 available) 18823 1726855045.90529: exiting _queue_task() for managed_node2/fail 18823 1726855045.90539: done queuing things up, now waiting for results queue to drain 18823 1726855045.90540: waiting for pending results... 
18823 1726855045.90909: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18823 1726855045.90914: in run() - task 0affcc66-ac2b-d391-077c-000000000060 18823 1726855045.90917: variable 'ansible_search_path' from source: unknown 18823 1726855045.90920: variable 'ansible_search_path' from source: unknown 18823 1726855045.90960: calling self._execute() 18823 1726855045.91059: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855045.91070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855045.91084: variable 'omit' from source: magic vars 18823 1726855045.91470: variable 'ansible_distribution_major_version' from source: facts 18823 1726855045.91489: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855045.91698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855045.95622: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855045.95761: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855045.95765: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855045.95768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855045.95805: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855045.95991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855045.96036: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855045.96066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855045.96117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855045.96136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855045.96235: variable 'ansible_distribution_major_version' from source: facts 18823 1726855045.96254: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18823 1726855045.96373: variable 'ansible_distribution' from source: facts 18823 1726855045.96382: variable '__network_rh_distros' from source: role '' defaults 18823 1726855045.96411: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18823 1726855045.96660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855045.96735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855045.96738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 
1726855045.96761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855045.96777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855045.96823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855045.96852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855045.96876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855045.96920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855045.96936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855045.96994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855045.97009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18823 1726855045.97035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855045.97276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855045.97280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855045.97747: variable 'network_connections' from source: play vars 18823 1726855045.97834: variable 'profile' from source: play vars 18823 1726855045.97913: variable 'profile' from source: play vars 18823 1726855045.98148: variable 'interface' from source: set_fact 18823 1726855045.98151: variable 'interface' from source: set_fact 18823 1726855045.98153: variable 'network_state' from source: role '' defaults 18823 1726855045.98307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855045.98709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855045.98753: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855045.98797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855045.99000: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855045.99003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855045.99050: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855045.99119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855045.99202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855045.99266: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18823 1726855045.99300: when evaluation is False, skipping this task 18823 1726855045.99354: _execute() done 18823 1726855045.99362: dumping result to json 18823 1726855045.99369: done dumping result, returning 18823 1726855045.99382: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-d391-077c-000000000060] 18823 1726855045.99395: sending task result for task 0affcc66-ac2b-d391-077c-000000000060 skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18823 1726855045.99610: no more pending results, returning what we have 18823 1726855045.99614: results queue empty 18823 1726855045.99615: checking for 
any_errors_fatal 18823 1726855045.99622: done checking for any_errors_fatal 18823 1726855045.99623: checking for max_fail_percentage 18823 1726855045.99625: done checking for max_fail_percentage 18823 1726855045.99625: checking to see if all hosts have failed and the running result is not ok 18823 1726855045.99626: done checking to see if all hosts have failed 18823 1726855045.99627: getting the remaining hosts for this loop 18823 1726855045.99628: done getting the remaining hosts for this loop 18823 1726855045.99632: getting the next task for host managed_node2 18823 1726855045.99640: done getting next task for host managed_node2 18823 1726855045.99645: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18823 1726855045.99647: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855045.99660: getting variables 18823 1726855045.99662: in VariableManager get_vars() 18823 1726855045.99705: Calling all_inventory to load vars for managed_node2 18823 1726855045.99708: Calling groups_inventory to load vars for managed_node2 18823 1726855045.99711: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855045.99722: Calling all_plugins_play to load vars for managed_node2 18823 1726855045.99725: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855045.99729: Calling groups_plugins_play to load vars for managed_node2 18823 1726855046.00853: done sending task result for task 0affcc66-ac2b-d391-077c-000000000060 18823 1726855046.00856: WORKER PROCESS EXITING 18823 1726855046.02789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855046.04099: done with get_vars() 18823 1726855046.04125: done getting variables 18823 1726855046.04171: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:57:26 -0400 (0:00:00.144) 0:00:37.693 ****** 18823 1726855046.04198: entering _queue_task() for managed_node2/dnf 18823 1726855046.04464: worker is 1 (out of 1 available) 18823 1726855046.04479: exiting _queue_task() for managed_node2/dnf 18823 1726855046.04492: done queuing things up, now waiting for results queue to drain 18823 1726855046.04493: waiting for pending results... 
18823 1726855046.04675: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18823 1726855046.04755: in run() - task 0affcc66-ac2b-d391-077c-000000000061 18823 1726855046.04766: variable 'ansible_search_path' from source: unknown 18823 1726855046.04770: variable 'ansible_search_path' from source: unknown 18823 1726855046.04801: calling self._execute() 18823 1726855046.04874: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855046.04879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855046.04889: variable 'omit' from source: magic vars 18823 1726855046.05229: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.05242: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855046.05602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855046.08278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855046.08328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855046.08359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855046.08386: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855046.08412: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855046.08472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.08758: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.08777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.08808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.08820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.08908: variable 'ansible_distribution' from source: facts 18823 1726855046.08912: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.08923: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18823 1726855046.09010: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855046.09097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.09115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.09132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.09163: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.09174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.09206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.09221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.09237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.09265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.09276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.09307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.09321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 
1726855046.09338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.09365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.09374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.09470: variable 'network_connections' from source: play vars 18823 1726855046.09480: variable 'profile' from source: play vars 18823 1726855046.09548: variable 'profile' from source: play vars 18823 1726855046.09551: variable 'interface' from source: set_fact 18823 1726855046.09668: variable 'interface' from source: set_fact 18823 1726855046.09671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855046.09948: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855046.09951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855046.09954: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855046.09956: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855046.10001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855046.10024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855046.10049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.10082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855046.10493: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855046.10798: variable 'network_connections' from source: play vars 18823 1726855046.10801: variable 'profile' from source: play vars 18823 1726855046.10866: variable 'profile' from source: play vars 18823 1726855046.10870: variable 'interface' from source: set_fact 18823 1726855046.10934: variable 'interface' from source: set_fact 18823 1726855046.10958: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855046.10961: when evaluation is False, skipping this task 18823 1726855046.10964: _execute() done 18823 1726855046.10966: dumping result to json 18823 1726855046.10968: done dumping result, returning 18823 1726855046.10998: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000061] 18823 1726855046.11001: sending task result for task 0affcc66-ac2b-d391-077c-000000000061 18823 1726855046.11168: done sending task result for task 0affcc66-ac2b-d391-077c-000000000061 18823 1726855046.11171: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18823 1726855046.11225: no more pending results, returning what we have 18823 1726855046.11228: results queue empty 18823 1726855046.11229: checking for any_errors_fatal 18823 1726855046.11237: done checking for any_errors_fatal 18823 1726855046.11238: checking for max_fail_percentage 18823 1726855046.11239: done checking for max_fail_percentage 18823 1726855046.11240: checking to see if all hosts have failed and the running result is not ok 18823 1726855046.11241: done checking to see if all hosts have failed 18823 1726855046.11241: getting the remaining hosts for this loop 18823 1726855046.11243: done getting the remaining hosts for this loop 18823 1726855046.11246: getting the next task for host managed_node2 18823 1726855046.11252: done getting next task for host managed_node2 18823 1726855046.11255: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18823 1726855046.11257: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855046.11269: getting variables 18823 1726855046.11270: in VariableManager get_vars() 18823 1726855046.11307: Calling all_inventory to load vars for managed_node2 18823 1726855046.11310: Calling groups_inventory to load vars for managed_node2 18823 1726855046.11311: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855046.11320: Calling all_plugins_play to load vars for managed_node2 18823 1726855046.11323: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855046.11325: Calling groups_plugins_play to load vars for managed_node2 18823 1726855046.12919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855046.14557: done with get_vars() 18823 1726855046.14585: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18823 1726855046.14669: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:57:26 -0400 (0:00:00.104) 0:00:37.798 ****** 18823 1726855046.14701: entering _queue_task() for managed_node2/yum 18823 1726855046.15206: worker is 1 (out of 1 available) 18823 1726855046.15220: exiting _queue_task() for managed_node2/yum 18823 1726855046.15231: done queuing things up, now waiting for results queue to drain 18823 1726855046.15232: waiting for pending results... 
18823 1726855046.15508: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18823 1726855046.15522: in run() - task 0affcc66-ac2b-d391-077c-000000000062 18823 1726855046.15537: variable 'ansible_search_path' from source: unknown 18823 1726855046.15541: variable 'ansible_search_path' from source: unknown 18823 1726855046.15581: calling self._execute() 18823 1726855046.15680: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855046.15683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855046.15699: variable 'omit' from source: magic vars 18823 1726855046.16108: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.16173: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855046.16307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855046.19010: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855046.19194: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855046.19224: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855046.19377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855046.19409: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855046.19510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.19552: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.19670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.19673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.19686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.19792: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.19817: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18823 1726855046.19820: when evaluation is False, skipping this task 18823 1726855046.19825: _execute() done 18823 1726855046.19827: dumping result to json 18823 1726855046.19829: done dumping result, returning 18823 1726855046.19832: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000062] 18823 1726855046.19838: sending task result for task 0affcc66-ac2b-d391-077c-000000000062 18823 1726855046.20093: done sending task result for task 0affcc66-ac2b-d391-077c-000000000062 18823 1726855046.20097: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18823 1726855046.20146: no more pending results, returning 
what we have 18823 1726855046.20149: results queue empty 18823 1726855046.20150: checking for any_errors_fatal 18823 1726855046.20156: done checking for any_errors_fatal 18823 1726855046.20157: checking for max_fail_percentage 18823 1726855046.20159: done checking for max_fail_percentage 18823 1726855046.20160: checking to see if all hosts have failed and the running result is not ok 18823 1726855046.20160: done checking to see if all hosts have failed 18823 1726855046.20161: getting the remaining hosts for this loop 18823 1726855046.20163: done getting the remaining hosts for this loop 18823 1726855046.20166: getting the next task for host managed_node2 18823 1726855046.20172: done getting next task for host managed_node2 18823 1726855046.20176: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18823 1726855046.20178: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855046.20192: getting variables 18823 1726855046.20194: in VariableManager get_vars() 18823 1726855046.20235: Calling all_inventory to load vars for managed_node2 18823 1726855046.20238: Calling groups_inventory to load vars for managed_node2 18823 1726855046.20241: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855046.20251: Calling all_plugins_play to load vars for managed_node2 18823 1726855046.20255: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855046.20258: Calling groups_plugins_play to load vars for managed_node2 18823 1726855046.22257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855046.25220: done with get_vars() 18823 1726855046.25255: done getting variables 18823 1726855046.25327: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:57:26 -0400 (0:00:00.106) 0:00:37.905 ****** 18823 1726855046.25358: entering _queue_task() for managed_node2/fail 18823 1726855046.25855: worker is 1 (out of 1 available) 18823 1726855046.25866: exiting _queue_task() for managed_node2/fail 18823 1726855046.25876: done queuing things up, now waiting for results queue to drain 18823 1726855046.25877: waiting for pending results... 
18823 1726855046.26312: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18823 1726855046.26317: in run() - task 0affcc66-ac2b-d391-077c-000000000063 18823 1726855046.26321: variable 'ansible_search_path' from source: unknown 18823 1726855046.26324: variable 'ansible_search_path' from source: unknown 18823 1726855046.26327: calling self._execute() 18823 1726855046.26353: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855046.26357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855046.26400: variable 'omit' from source: magic vars 18823 1726855046.27310: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.27321: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855046.27457: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855046.27886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855046.31313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855046.31378: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855046.31424: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855046.31458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855046.31496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855046.31585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18823 1726855046.31617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.31644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.31698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.31713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.31759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.31788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.31821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.31858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.31878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.31965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.31969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.31971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.32022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.32037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.32216: variable 'network_connections' from source: play vars 18823 1726855046.32237: variable 'profile' from source: play vars 18823 1726855046.32392: variable 'profile' from source: play vars 18823 1726855046.32400: variable 'interface' from source: set_fact 18823 1726855046.32406: variable 'interface' from source: set_fact 18823 1726855046.32473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855046.32668: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855046.32707: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855046.32737: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855046.32774: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855046.32817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855046.32840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855046.32875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.32906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855046.32948: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855046.33210: variable 'network_connections' from source: play vars 18823 1726855046.33215: variable 'profile' from source: play vars 18823 1726855046.33286: variable 'profile' from source: play vars 18823 1726855046.33291: variable 'interface' from source: set_fact 18823 1726855046.33595: variable 'interface' from source: set_fact 18823 1726855046.33598: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855046.33600: when evaluation is False, skipping this task 18823 1726855046.33602: _execute() done 18823 1726855046.33606: dumping result to json 18823 1726855046.33608: done dumping result, returning 18823 1726855046.33609: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000063] 18823 1726855046.33619: sending task result for task 0affcc66-ac2b-d391-077c-000000000063 18823 1726855046.33684: done sending task result for task 0affcc66-ac2b-d391-077c-000000000063 18823 1726855046.33689: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18823 1726855046.33908: no more pending results, returning what we have 18823 1726855046.33911: results queue empty 18823 1726855046.33912: checking for any_errors_fatal 18823 1726855046.33918: done checking for any_errors_fatal 18823 1726855046.33919: checking for max_fail_percentage 18823 1726855046.33921: done checking for max_fail_percentage 18823 1726855046.33922: checking to see if all hosts have failed and the running result is not ok 18823 1726855046.33922: done checking to see if all hosts have failed 18823 1726855046.33923: getting the remaining hosts for this loop 18823 1726855046.33924: done getting the remaining hosts for this loop 18823 1726855046.33931: getting the next task for host managed_node2 18823 1726855046.33936: done getting next task for host managed_node2 18823 1726855046.33940: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18823 1726855046.33943: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855046.33955: getting variables 18823 1726855046.33957: in VariableManager get_vars() 18823 1726855046.33994: Calling all_inventory to load vars for managed_node2 18823 1726855046.33997: Calling groups_inventory to load vars for managed_node2 18823 1726855046.34000: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855046.34009: Calling all_plugins_play to load vars for managed_node2 18823 1726855046.34012: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855046.34015: Calling groups_plugins_play to load vars for managed_node2 18823 1726855046.35661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855046.37302: done with get_vars() 18823 1726855046.37327: done getting variables 18823 1726855046.37401: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:57:26 -0400 (0:00:00.120) 0:00:38.026 ****** 18823 1726855046.37437: entering _queue_task() for managed_node2/package 18823 1726855046.37913: worker is 1 (out of 1 available) 18823 1726855046.37926: exiting _queue_task() for managed_node2/package 18823 1726855046.37937: done queuing things up, now waiting for results queue to drain 18823 1726855046.37938: waiting for pending results... 
18823 1726855046.38311: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 18823 1726855046.38317: in run() - task 0affcc66-ac2b-d391-077c-000000000064 18823 1726855046.38320: variable 'ansible_search_path' from source: unknown 18823 1726855046.38322: variable 'ansible_search_path' from source: unknown 18823 1726855046.38325: calling self._execute() 18823 1726855046.38406: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855046.38410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855046.38418: variable 'omit' from source: magic vars 18823 1726855046.38818: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.38837: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855046.39048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855046.39335: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855046.39384: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855046.39427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855046.39520: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855046.39640: variable 'network_packages' from source: role '' defaults 18823 1726855046.39753: variable '__network_provider_setup' from source: role '' defaults 18823 1726855046.39777: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855046.39855: variable '__network_service_name_default_nm' from source: role '' defaults 18823 1726855046.39876: variable '__network_packages_default_nm' from source: role '' defaults 18823 1726855046.39948: variable 
'__network_packages_default_nm' from source: role '' defaults 18823 1726855046.40151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855046.42247: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855046.42325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855046.42375: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855046.42419: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855046.42450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855046.42543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.42598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.42617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.42662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.42707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 
1726855046.42743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.42771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.42815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.42853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.42893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.43124: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18823 1726855046.43250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.43392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.43395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.43397: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.43399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.43455: variable 'ansible_python' from source: facts 18823 1726855046.43485: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18823 1726855046.43586: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855046.43675: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855046.43825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.43860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.43890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.43934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.43963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.44060: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.44072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.44079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.44125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.44144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.44601: variable 'network_connections' from source: play vars 18823 1726855046.44604: variable 'profile' from source: play vars 18823 1726855046.44640: variable 'profile' from source: play vars 18823 1726855046.44653: variable 'interface' from source: set_fact 18823 1726855046.44735: variable 'interface' from source: set_fact 18823 1726855046.44875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855046.44956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855046.45068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.45163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855046.45215: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855046.45743: variable 'network_connections' from source: play vars 18823 1726855046.45763: variable 'profile' from source: play vars 18823 1726855046.45881: variable 'profile' from source: play vars 18823 1726855046.45901: variable 'interface' from source: set_fact 18823 1726855046.45977: variable 'interface' from source: set_fact 18823 1726855046.46093: variable '__network_packages_default_wireless' from source: role '' defaults 18823 1726855046.46127: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855046.46454: variable 'network_connections' from source: play vars 18823 1726855046.46464: variable 'profile' from source: play vars 18823 1726855046.46531: variable 'profile' from source: play vars 18823 1726855046.46540: variable 'interface' from source: set_fact 18823 1726855046.46647: variable 'interface' from source: set_fact 18823 1726855046.46685: variable '__network_packages_default_team' from source: role '' defaults 18823 1726855046.46777: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855046.47116: variable 'network_connections' from source: play vars 18823 1726855046.47126: variable 'profile' from source: play vars 18823 1726855046.47215: variable 'profile' from source: play vars 18823 1726855046.47218: variable 'interface' from source: set_fact 18823 1726855046.47306: variable 'interface' from source: set_fact 18823 1726855046.47370: variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855046.47443: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18823 1726855046.47494: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855046.47520: variable '__network_packages_default_initscripts' from source: role '' defaults 18823 1726855046.47746: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18823 1726855046.48339: variable 'network_connections' from source: play vars 18823 1726855046.48349: variable 'profile' from source: play vars 18823 1726855046.48422: variable 'profile' from source: play vars 18823 1726855046.48431: variable 'interface' from source: set_fact 18823 1726855046.48521: variable 'interface' from source: set_fact 18823 1726855046.48524: variable 'ansible_distribution' from source: facts 18823 1726855046.48526: variable '__network_rh_distros' from source: role '' defaults 18823 1726855046.48536: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.48593: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18823 1726855046.48738: variable 'ansible_distribution' from source: facts 18823 1726855046.48853: variable '__network_rh_distros' from source: role '' defaults 18823 1726855046.48856: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.48858: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18823 1726855046.48942: variable 'ansible_distribution' from source: facts 18823 1726855046.48951: variable '__network_rh_distros' from source: role '' defaults 18823 1726855046.48963: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.49009: variable 'network_provider' from source: set_fact 18823 1726855046.49029: variable 'ansible_facts' from source: unknown 18823 1726855046.49979: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18823 
1726855046.49986: when evaluation is False, skipping this task
18823 1726855046.49994: _execute() done
18823 1726855046.50001: dumping result to json
18823 1726855046.50009: done dumping result, returning
18823 1726855046.50020: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-d391-077c-000000000064]
18823 1726855046.50047: sending task result for task 0affcc66-ac2b-d391-077c-000000000064
18823 1726855046.50210: done sending task result for task 0affcc66-ac2b-d391-077c-000000000064
18823 1726855046.50213: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
18823 1726855046.50261: no more pending results, returning what we have
18823 1726855046.50265: results queue empty
18823 1726855046.50266: checking for any_errors_fatal
18823 1726855046.50274: done checking for any_errors_fatal
18823 1726855046.50275: checking for max_fail_percentage
18823 1726855046.50276: done checking for max_fail_percentage
18823 1726855046.50277: checking to see if all hosts have failed and the running result is not ok
18823 1726855046.50278: done checking to see if all hosts have failed
18823 1726855046.50278: getting the remaining hosts for this loop
18823 1726855046.50280: done getting the remaining hosts for this loop
18823 1726855046.50284: getting the next task for host managed_node2
18823 1726855046.50294: done getting next task for host managed_node2
18823 1726855046.50297: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
18823 1726855046.50299: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855046.50313: getting variables
18823 1726855046.50315: in VariableManager get_vars()
18823 1726855046.50353: Calling all_inventory to load vars for managed_node2
18823 1726855046.50356: Calling groups_inventory to load vars for managed_node2
18823 1726855046.50359: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855046.50376: Calling all_plugins_play to load vars for managed_node2
18823 1726855046.50380: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855046.50383: Calling groups_plugins_play to load vars for managed_node2
18823 1726855046.52097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855046.53958: done with get_vars()
18823 1726855046.53991: done getting variables
18823 1726855046.54052: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 13:57:26 -0400 (0:00:00.166) 0:00:38.192 ******
18823 1726855046.54082: entering _queue_task() for managed_node2/package
18823 1726855046.54507: worker is 1 (out of 1 available)
18823 1726855046.54525: exiting _queue_task() for managed_node2/package
18823 1726855046.54536: done queuing things up, now waiting for results queue to drain
18823 1726855046.54537: waiting for pending results...
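The "Install packages" skip above hinges on Ansible's Jinja2 `subset` test: the task runs only when `network_packages` is not already a subset of the installed-package names in `ansible_facts.packages`. A minimal Python sketch of the equivalent set logic (the package names below are hypothetical illustrations, not values from this run):

```python
# Equivalent of the conditional
# "not network_packages is subset(ansible_facts.packages.keys())".
# Package names are hypothetical examples, not taken from this log.
def install_needed(network_packages, installed_names):
    """True when at least one required package is missing, i.e. the
    required list is NOT a subset of the installed-package names."""
    return not set(network_packages) <= set(installed_names)

installed = {"NetworkManager", "openssh-server"}
print(install_needed(["NetworkManager"], installed))   # False -> task skipped
print(install_needed(["wpa_supplicant"], installed))   # True  -> task would run
```

When the check is False, as here, the package action is skipped entirely and no package-manager transaction is started on the managed node.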
18823 1726855046.54811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
18823 1726855046.54879: in run() - task 0affcc66-ac2b-d391-077c-000000000065
18823 1726855046.54994: variable 'ansible_search_path' from source: unknown
18823 1726855046.54997: variable 'ansible_search_path' from source: unknown
18823 1726855046.55000: calling self._execute()
18823 1726855046.55058: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855046.55070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855046.55084: variable 'omit' from source: magic vars
18823 1726855046.55477: variable 'ansible_distribution_major_version' from source: facts
18823 1726855046.55498: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855046.55628: variable 'network_state' from source: role '' defaults
18823 1726855046.55642: Evaluated conditional (network_state != {}): False
18823 1726855046.55648: when evaluation is False, skipping this task
18823 1726855046.55655: _execute() done
18823 1726855046.55671: dumping result to json
18823 1726855046.55678: done dumping result, returning
18823 1726855046.55691: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-d391-077c-000000000065]
18823 1726855046.55702: sending task result for task 0affcc66-ac2b-d391-077c-000000000065
18823 1726855046.55921: done sending task result for task 0affcc66-ac2b-d391-077c-000000000065
18823 1726855046.55925: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18823 1726855046.55975: no more pending results, returning what we have
18823 1726855046.55979: results queue empty
18823 1726855046.55980: checking for any_errors_fatal
18823 1726855046.55991: done checking for any_errors_fatal
18823 1726855046.55992: checking for max_fail_percentage
18823 1726855046.55994: done checking for max_fail_percentage
18823 1726855046.55995: checking to see if all hosts have failed and the running result is not ok
18823 1726855046.55995: done checking to see if all hosts have failed
18823 1726855046.55996: getting the remaining hosts for this loop
18823 1726855046.55998: done getting the remaining hosts for this loop
18823 1726855046.56001: getting the next task for host managed_node2
18823 1726855046.56009: done getting next task for host managed_node2
18823 1726855046.56013: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
18823 1726855046.56016: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855046.56030: getting variables
18823 1726855046.56032: in VariableManager get_vars()
18823 1726855046.56070: Calling all_inventory to load vars for managed_node2
18823 1726855046.56073: Calling groups_inventory to load vars for managed_node2
18823 1726855046.56076: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855046.56205: Calling all_plugins_play to load vars for managed_node2
18823 1726855046.56209: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855046.56215: Calling groups_plugins_play to load vars for managed_node2
18823 1726855046.57626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855046.59310: done with get_vars()
18823 1726855046.59334: done getting variables
18823 1726855046.59402: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 13:57:26 -0400 (0:00:00.053) 0:00:38.245 ******
18823 1726855046.59431: entering _queue_task() for managed_node2/package
18823 1726855046.59760: worker is 1 (out of 1 available)
18823 1726855046.59775: exiting _queue_task() for managed_node2/package
18823 1726855046.60095: done queuing things up, now waiting for results queue to drain
18823 1726855046.60097: waiting for pending results...
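The nmstate-related package tasks in this stretch of the run are all gated on the same conditional, `network_state != {}`; with the role's default of an empty dict, each one is skipped. A small sketch of that gate (the non-empty example value is a hypothetical illustration):

```python
# Sketch of the role gate "network_state != {}": the NetworkManager/nmstate
# package tasks only run when the caller supplies a non-empty network_state.
# The example value below is hypothetical, not taken from this run.
def nmstate_tasks_run(network_state):
    """Mirrors the task conditional `network_state != {}`."""
    return network_state != {}

print(nmstate_tasks_run({}))                                  # False -> tasks skipped
print(nmstate_tasks_run({"interfaces": [{"name": "eth0"}]}))  # True  -> tasks would run
```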
18823 1726855046.60550: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
18823 1726855046.60680: in run() - task 0affcc66-ac2b-d391-077c-000000000066
18823 1726855046.60685: variable 'ansible_search_path' from source: unknown
18823 1726855046.60790: variable 'ansible_search_path' from source: unknown
18823 1726855046.60794: calling self._execute()
18823 1726855046.61196: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855046.61200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855046.61203: variable 'omit' from source: magic vars
18823 1726855046.62011: variable 'ansible_distribution_major_version' from source: facts
18823 1726855046.62077: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855046.62339: variable 'network_state' from source: role '' defaults
18823 1726855046.62511: Evaluated conditional (network_state != {}): False
18823 1726855046.62515: when evaluation is False, skipping this task
18823 1726855046.62517: _execute() done
18823 1726855046.62519: dumping result to json
18823 1726855046.62521: done dumping result, returning
18823 1726855046.62524: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-d391-077c-000000000066]
18823 1726855046.62526: sending task result for task 0affcc66-ac2b-d391-077c-000000000066
18823 1726855046.62598: done sending task result for task 0affcc66-ac2b-d391-077c-000000000066
18823 1726855046.62601: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18823 1726855046.62660: no more pending results, returning what we have
18823 1726855046.62664: results queue empty
18823 1726855046.62665: checking for any_errors_fatal
18823 1726855046.62673: done checking for any_errors_fatal
18823 1726855046.62674: checking for max_fail_percentage
18823 1726855046.62676: done checking for max_fail_percentage
18823 1726855046.62677: checking to see if all hosts have failed and the running result is not ok
18823 1726855046.62677: done checking to see if all hosts have failed
18823 1726855046.62678: getting the remaining hosts for this loop
18823 1726855046.62679: done getting the remaining hosts for this loop
18823 1726855046.62683: getting the next task for host managed_node2
18823 1726855046.62692: done getting next task for host managed_node2
18823 1726855046.62696: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
18823 1726855046.62698: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855046.62720: getting variables
18823 1726855046.62723: in VariableManager get_vars()
18823 1726855046.62761: Calling all_inventory to load vars for managed_node2
18823 1726855046.62765: Calling groups_inventory to load vars for managed_node2
18823 1726855046.62767: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855046.62780: Calling all_plugins_play to load vars for managed_node2
18823 1726855046.62783: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855046.62786: Calling groups_plugins_play to load vars for managed_node2
18823 1726855046.67344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855046.69768: done with get_vars()
18823 1726855046.69797: done getting variables
18823 1726855046.69866: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 13:57:26 -0400 (0:00:00.104) 0:00:38.350 ******
18823 1726855046.69900: entering _queue_task() for managed_node2/service
18823 1726855046.70250: worker is 1 (out of 1 available)
18823 1726855046.70263: exiting _queue_task() for managed_node2/service
18823 1726855046.70274: done queuing things up, now waiting for results queue to drain
18823 1726855046.70393: waiting for pending results...
18823 1726855046.70693: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18823 1726855046.70708: in run() - task 0affcc66-ac2b-d391-077c-000000000067 18823 1726855046.70732: variable 'ansible_search_path' from source: unknown 18823 1726855046.70739: variable 'ansible_search_path' from source: unknown 18823 1726855046.70778: calling self._execute() 18823 1726855046.71056: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855046.71103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855046.71107: variable 'omit' from source: magic vars 18823 1726855046.72026: variable 'ansible_distribution_major_version' from source: facts 18823 1726855046.72055: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855046.72194: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855046.72440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855046.74765: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855046.74853: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855046.74902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855046.74993: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855046.74997: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855046.75078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18823 1726855046.75130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.75168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.75215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.75236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.75308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.75376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.75379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.75422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.75446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.75503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855046.75532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855046.75567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.75703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855046.75708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855046.76200: variable 'network_connections' from source: play vars 18823 1726855046.76263: variable 'profile' from source: play vars 18823 1726855046.76575: variable 'profile' from source: play vars 18823 1726855046.76694: variable 'interface' from source: set_fact 18823 1726855046.76698: variable 'interface' from source: set_fact 18823 1726855046.76721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855046.77212: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855046.77234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855046.77268: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855046.77424: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855046.77463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855046.77491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855046.77651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855046.77771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855046.77982: variable '__network_team_connections_defined' from source: role '' defaults 18823 1726855046.78181: variable 'network_connections' from source: play vars 18823 1726855046.78185: variable 'profile' from source: play vars 18823 1726855046.78250: variable 'profile' from source: play vars 18823 1726855046.78253: variable 'interface' from source: set_fact 18823 1726855046.78341: variable 'interface' from source: set_fact 18823 1726855046.78400: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18823 1726855046.78406: when evaluation is False, skipping this task 18823 1726855046.78408: _execute() done 18823 1726855046.78411: dumping result to json 18823 1726855046.78413: done dumping result, returning 18823 1726855046.78415: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcc66-ac2b-d391-077c-000000000067]
18823 1726855046.78425: sending task result for task 0affcc66-ac2b-d391-077c-000000000067
18823 1726855046.78741: done sending task result for task 0affcc66-ac2b-d391-077c-000000000067
18823 1726855046.78744: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
18823 1726855046.78784: no more pending results, returning what we have
18823 1726855046.78789: results queue empty
18823 1726855046.78791: checking for any_errors_fatal
18823 1726855046.78797: done checking for any_errors_fatal
18823 1726855046.78797: checking for max_fail_percentage
18823 1726855046.78799: done checking for max_fail_percentage
18823 1726855046.78800: checking to see if all hosts have failed and the running result is not ok
18823 1726855046.78800: done checking to see if all hosts have failed
18823 1726855046.78801: getting the remaining hosts for this loop
18823 1726855046.78802: done getting the remaining hosts for this loop
18823 1726855046.78806: getting the next task for host managed_node2
18823 1726855046.78811: done getting next task for host managed_node2
18823 1726855046.78815: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
18823 1726855046.78817: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855046.78830: getting variables
18823 1726855046.78832: in VariableManager get_vars()
18823 1726855046.78869: Calling all_inventory to load vars for managed_node2
18823 1726855046.78871: Calling groups_inventory to load vars for managed_node2
18823 1726855046.78874: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855046.78883: Calling all_plugins_play to load vars for managed_node2
18823 1726855046.78886: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855046.78891: Calling groups_plugins_play to load vars for managed_node2
18823 1726855046.80454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855046.83045: done with get_vars()
18823 1726855046.83093: done getting variables
18823 1726855046.83153: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:57:26 -0400 (0:00:00.132) 0:00:38.483 ******
18823 1726855046.83189: entering _queue_task() for managed_node2/service
18823 1726855046.83529: worker is 1 (out of 1 available)
18823 1726855046.83540: exiting _queue_task() for managed_node2/service
18823 1726855046.83552: done queuing things up, now waiting for results queue to drain
18823 1726855046.83553: waiting for pending results...
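The "Restart NetworkManager" task in this run was skipped because the requested `network_connections` define neither wireless nor team interfaces. A hedged sketch of that OR-condition (the connection dicts below are made-up examples, not profiles from this log):

```python
# Sketch of the gate "__network_wireless_connections_defined or
# __network_team_connections_defined": NetworkManager is only restarted when
# the requested profiles include a wireless or team interface.
# The connection dicts are hypothetical.
def needs_nm_restart(network_connections):
    """True when any requested connection is of type wireless or team."""
    return any(c.get("type") in ("wireless", "team") for c in network_connections)

print(needs_nm_restart([{"name": "eth0", "type": "ethernet"}]))   # False -> restart skipped
print(needs_nm_restart([{"name": "wlan0", "type": "wireless"}]))  # True
```

Because plain ethernet profiles do not require a daemon restart, the role falls straight through to the "Enable and start NetworkManager" service task that follows.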
18823 1726855046.83912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
18823 1726855046.83979: in run() - task 0affcc66-ac2b-d391-077c-000000000068
18823 1726855046.83994: variable 'ansible_search_path' from source: unknown
18823 1726855046.84197: variable 'ansible_search_path' from source: unknown
18823 1726855046.84201: calling self._execute()
18823 1726855046.84208: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855046.84211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855046.84214: variable 'omit' from source: magic vars
18823 1726855046.84517: variable 'ansible_distribution_major_version' from source: facts
18823 1726855046.84532: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855046.84696: variable 'network_provider' from source: set_fact
18823 1726855046.84709: variable 'network_state' from source: role '' defaults
18823 1726855046.84716: Evaluated conditional (network_provider == "nm" or network_state != {}): True
18823 1726855046.84722: variable 'omit' from source: magic vars
18823 1726855046.84767: variable 'omit' from source: magic vars
18823 1726855046.84800: variable 'network_service_name' from source: role '' defaults
18823 1726855046.84880: variable 'network_service_name' from source: role '' defaults
18823 1726855046.85039: variable '__network_provider_setup' from source: role '' defaults
18823 1726855046.85045: variable '__network_service_name_default_nm' from source: role '' defaults
18823 1726855046.85195: variable '__network_service_name_default_nm' from source: role '' defaults
18823 1726855046.85199: variable '__network_packages_default_nm' from source: role '' defaults
18823 1726855046.85212: variable '__network_packages_default_nm' from source: role '' defaults
18823 1726855046.85549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18823 1726855046.88804: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18823 1726855046.88808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18823 1726855046.88811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18823 1726855046.88835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18823 1726855046.88865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18823 1726855046.88972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855046.89018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855046.89045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855046.89089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855046.89153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855046.89190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855046.89246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855046.89249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855046.89289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855046.89298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855046.89615: variable '__network_packages_default_gobject_packages' from source: role '' defaults
18823 1726855046.89746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855046.89796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855046.89811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855046.89911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855046.89918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855046.89962: variable 'ansible_python' from source: facts
18823 1726855046.89991: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
18823 1726855046.90073: variable '__network_wpa_supplicant_required' from source: role '' defaults
18823 1726855046.90156: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
18823 1726855046.90285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855046.90337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855046.90345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855046.90454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855046.90461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855046.90464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18823 1726855046.90475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18823 1726855046.90508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855046.90550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18823 1726855046.90564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18823 1726855046.90711: variable 'network_connections' from source: play vars
18823 1726855046.90718: variable 'profile' from source: play vars
18823 1726855046.90894: variable 'profile' from source: play vars
18823 1726855046.90898: variable 'interface' from source: set_fact
18823 1726855046.90900: variable 'interface' from source: set_fact
18823 1726855046.91001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18823 1726855046.91186: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18823 1726855046.91238: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18823 1726855046.91496: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18823 1726855046.91499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18823 1726855046.91695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18823 1726855046.91701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18823 1726855046.91704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18823 1726855046.91710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18823 1726855046.91713: variable '__network_wireless_connections_defined' from source: role '' defaults
18823 1726855046.91846: variable 'network_connections' from source: play vars
18823 1726855046.91859: variable 'profile' from source: play vars
18823 1726855046.91945: variable 'profile' from source: play vars
18823 1726855046.91958: variable 'interface' from source: set_fact
18823 1726855046.92023: variable 'interface' from source: set_fact
18823 1726855046.92073: variable '__network_packages_default_wireless' from source: role '' defaults
18823 1726855046.92169: variable '__network_wireless_connections_defined' from source: role '' defaults
18823 1726855046.92498: variable 'network_connections' from source: play vars
18823 1726855046.92501: variable 'profile' from source: play vars
18823 1726855046.92586: variable 'profile' from source: play vars
18823 1726855046.92594: variable 'interface' from source: set_fact
18823 1726855046.92674: variable 'interface' from source: set_fact
18823 1726855046.92712: variable '__network_packages_default_team' from source: role '' defaults
18823 1726855046.92796: variable '__network_team_connections_defined' from source: role '' defaults
18823 1726855046.93101: variable 'network_connections' from source: play vars
18823 1726855046.93140: variable 'profile' from source: play vars
18823 1726855046.93174: variable 'profile' from source: play vars
18823 1726855046.93178: variable 'interface' from source: set_fact
18823 1726855046.93259: variable 'interface' from source: set_fact
18823 1726855046.93335: variable '__network_service_name_default_initscripts' from source: role '' defaults
18823 1726855046.93396: variable '__network_service_name_default_initscripts' from source: role '' defaults
18823 1726855046.93402: variable '__network_packages_default_initscripts' from source: role '' defaults
18823 1726855046.93468: variable '__network_packages_default_initscripts' from source: role '' defaults
18823 1726855046.93794: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
18823 1726855046.94227: variable 'network_connections' from source: play vars
18823 1726855046.94236: variable 'profile' from source: play vars
18823 1726855046.94302: variable 'profile' from source: play vars
18823 1726855046.94305: variable 'interface' from source: set_fact
18823 1726855046.94382: variable 'interface' from source: set_fact
18823 1726855046.94391: variable 'ansible_distribution' from source: facts
18823 1726855046.94396: variable '__network_rh_distros' from source: role '' defaults
18823 1726855046.94402: variable 'ansible_distribution_major_version' from source: facts
18823 1726855046.94420: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
18823 1726855046.94644: variable 'ansible_distribution' from source: facts
18823 1726855046.94647: variable '__network_rh_distros' from source: role '' defaults
18823 1726855046.94660: variable 'ansible_distribution_major_version' from source: facts
18823 1726855046.94670: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
18823 1726855046.94889: variable 'ansible_distribution' from source: facts
18823 1726855046.94893: variable '__network_rh_distros' from source: role '' defaults
18823 1726855046.94898: variable 'ansible_distribution_major_version' from source: facts
18823 1726855046.94935: variable 'network_provider' from source: set_fact
18823 1726855046.94986: variable 'omit' from source: magic vars
18823 1726855046.94991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18823 1726855046.95020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18823 1726855046.95038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18823 1726855046.95100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18823 1726855046.95104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18823 1726855046.95106: variable 'inventory_hostname' from source: host vars for 'managed_node2'
18823 1726855046.95109: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855046.95116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855046.95230: Set connection var ansible_timeout to 10
18823 1726855046.95313: Set connection var ansible_module_compression to ZIP_DEFLATED
18823 1726855046.95316: Set connection var ansible_shell_type to sh
18823 1726855046.95318: Set connection var ansible_shell_executable to /bin/sh
18823 1726855046.95321: Set connection var ansible_connection to ssh
18823 1726855046.95323: Set connection var ansible_pipelining to False
18823 1726855046.95324: variable 'ansible_shell_executable' from source: unknown
18823 1726855046.95327: variable 'ansible_connection' from source: unknown
18823 1726855046.95329: variable 'ansible_module_compression' from source: unknown
18823 1726855046.95331: variable 'ansible_shell_type' from source: unknown
18823 1726855046.95333: variable 'ansible_shell_executable' from source: unknown
18823 1726855046.95335: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855046.95340: variable 'ansible_pipelining' from source: unknown
18823 1726855046.95343: variable 'ansible_timeout' from source: unknown
18823 1726855046.95345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855046.95422: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18823 1726855046.95429: variable 'omit' from source: magic vars
18823 1726855046.95436: starting attempt loop
18823 1726855046.95439: running the handler
18823 1726855046.95535: variable 'ansible_facts' from source: unknown
18823 1726855046.96899: _low_level_execute_command(): starting
18823 1726855046.96904: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18823 1726855046.98516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855046.98705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855046.98839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855046.98937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855047.00724: stdout chunk (state=3): >>>/root <<<
18823 1726855047.00851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855047.00855: stdout chunk (state=3): >>><<<
18823 1726855047.00858: stderr chunk (state=3): >>><<<
18823 1726855047.00895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18823 1726855047.01045: _low_level_execute_command(): starting
18823 1726855047.01049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464 `" && echo ansible-tmp-1726855047.008807-20640-254549749262464="` echo /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464 `" ) && sleep 0'
18823 1726855047.02566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855047.02589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855047.02795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.02801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855047.02804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.02806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855047.02808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855047.02905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855047.03073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855047.04995: stdout chunk (state=3): >>>ansible-tmp-1726855047.008807-20640-254549749262464=/root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464 <<<
18823 1726855047.05142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855047.05193: stderr chunk (state=3): >>><<<
18823 1726855047.05196: stdout chunk (state=3): >>><<<
18823 1726855047.05245: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855047.008807-20640-254549749262464=/root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18823 1726855047.05299: variable 'ansible_module_compression' from source: unknown
18823 1726855047.05366: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED
18823 1726855047.05603: variable 'ansible_facts' from source: unknown
18823 1726855047.05701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py
18823 1726855047.06007: Sending initial data
18823 1726855047.06016: Sent initial data (155 bytes)
18823 1726855047.06585: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18823 1726855047.06627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855047.06663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.06672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
18823 1726855047.06684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<<
18823 1726855047.06719: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855047.06770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.06819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855047.06844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855047.06862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855047.06953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855047.08563: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
18823 1726855047.08645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
18823 1726855047.08741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp65y7k8we /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py <<<
18823 1726855047.08745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py" <<<
18823 1726855047.08824: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp65y7k8we" to remote "/root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py" <<<
18823 1726855047.10806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855047.10976: stderr chunk (state=3): >>><<<
18823 1726855047.10979: stdout chunk (state=3): >>><<<
18823 1726855047.11071: done transferring module to remote
18823 1726855047.11075: _low_level_execute_command(): starting
18823 1726855047.11077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/ /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py && sleep 0'
18823 1726855047.11696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18823 1726855047.11724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855047.11737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
18823 1726855047.11741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855047.11743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<<
18823 1726855047.11830: stderr chunk (state=3): >>>debug2: match not found <<<
18823 1726855047.11834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.11838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
18823 1726855047.11840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<<
18823 1726855047.11842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
18823 1726855047.11844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855047.11846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
18823 1726855047.11848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855047.11850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<<
18823 1726855047.11851: stderr chunk (state=3): >>>debug2: match found <<<
18823 1726855047.11853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.11932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855047.11935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855047.11957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855047.12058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855047.13864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855047.13920: stderr chunk (state=3): >>><<<
18823 1726855047.13923: stdout chunk (state=3): >>><<<
18823 1726855047.13938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18823 1726855047.13941: _low_level_execute_command(): starting
18823 1726855047.13946: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/AnsiballZ_systemd.py && sleep 0'
18823 1726855047.14359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
18823 1726855047.14392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855047.14395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<<
18823 1726855047.14398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.14400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18823 1726855047.14402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855047.14455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855047.14460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855047.14536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855047.43695: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4562944", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317882880", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1186234000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations":
"[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 18823 1726855047.43701: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18823 1726855047.45602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855047.45633: stderr chunk (state=3): >>><<< 18823 1726855047.45636: stdout chunk (state=3): >>><<< 18823 1726855047.45652: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4562944", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317882880", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1186234000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", 
"ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855047.45768: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855047.45785: _low_level_execute_command(): starting 18823 1726855047.45791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855047.008807-20640-254549749262464/ > /dev/null 2>&1 && sleep 0' 18823 1726855047.46251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855047.46254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855047.46256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855047.46258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855047.46260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855047.46319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855047.46324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855047.46326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855047.46398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855047.48278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855047.48298: stderr chunk (state=3): >>><<< 18823 1726855047.48300: stdout chunk (state=3): >>><<< 18823 1726855047.48321: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855047.48324: handler run complete 18823 1726855047.48457: attempt loop complete, returning result 18823 1726855047.48460: _execute() done 18823 1726855047.48462: dumping result to json 18823 1726855047.48464: done dumping result, returning 18823 1726855047.48466: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-d391-077c-000000000068] 18823 1726855047.48468: sending task result for task 0affcc66-ac2b-d391-077c-000000000068 18823 1726855047.48763: done sending task result for task 0affcc66-ac2b-d391-077c-000000000068 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855047.48815: no more pending results, returning what we have 18823 1726855047.48818: results queue empty 18823 1726855047.48819: checking for any_errors_fatal 18823 1726855047.48825: done checking for any_errors_fatal 18823 1726855047.48826: checking for max_fail_percentage 18823 1726855047.48828: done checking for max_fail_percentage 18823 1726855047.48828: checking to see if all hosts have failed and the running result is not ok 18823 1726855047.48829: done checking to see if all hosts have failed 18823 1726855047.48830: getting the remaining hosts for this loop 18823 1726855047.48832: done getting the remaining hosts for this loop 18823 1726855047.48836: getting the next task for host managed_node2 18823 1726855047.48843: done getting next task for host managed_node2 18823 1726855047.48846: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18823 1726855047.48848: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855047.48860: WORKER PROCESS EXITING 18823 1726855047.48864: getting variables 18823 1726855047.48865: in VariableManager get_vars() 18823 1726855047.48933: Calling all_inventory to load vars for managed_node2 18823 1726855047.48945: Calling groups_inventory to load vars for managed_node2 18823 1726855047.48948: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855047.48957: Calling all_plugins_play to load vars for managed_node2 18823 1726855047.48960: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855047.48962: Calling groups_plugins_play to load vars for managed_node2 18823 1726855047.50531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855047.52173: done with get_vars() 18823 1726855047.52204: done getting variables 18823 1726855047.52289: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:57:27 -0400 (0:00:00.691) 0:00:39.174 ****** 18823 1726855047.52327: entering _queue_task() for managed_node2/service 18823 1726855047.52672: worker is 1 (out of 1 available) 18823 1726855047.52686: exiting _queue_task() for managed_node2/service 18823 1726855047.52699: done queuing things up, now waiting for results queue to drain 18823 1726855047.52700: waiting for pending results... 
18823 1726855047.52975: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18823 1726855047.53056: in run() - task 0affcc66-ac2b-d391-077c-000000000069 18823 1726855047.53069: variable 'ansible_search_path' from source: unknown 18823 1726855047.53073: variable 'ansible_search_path' from source: unknown 18823 1726855047.53109: calling self._execute() 18823 1726855047.53180: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855047.53184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855047.53197: variable 'omit' from source: magic vars 18823 1726855047.53540: variable 'ansible_distribution_major_version' from source: facts 18823 1726855047.53545: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855047.53763: variable 'network_provider' from source: set_fact 18823 1726855047.53767: Evaluated conditional (network_provider == "nm"): True 18823 1726855047.53770: variable '__network_wpa_supplicant_required' from source: role '' defaults 18823 1726855047.53791: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18823 1726855047.53970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855047.55972: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855047.56117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855047.56121: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855047.56124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855047.56126: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855047.56352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855047.56356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855047.56358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855047.56361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855047.56363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855047.56365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855047.56382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855047.56406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855047.56486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855047.56492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855047.56495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855047.56519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855047.56541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855047.56595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855047.56598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855047.56992: variable 'network_connections' from source: play vars 18823 1726855047.56996: variable 'profile' from source: play vars 18823 1726855047.56999: variable 'profile' from source: play vars 18823 1726855047.57001: variable 'interface' from source: set_fact 18823 1726855047.57003: variable 'interface' from source: set_fact 18823 1726855047.57005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18823 1726855047.57100: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18823 1726855047.57138: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18823 1726855047.57167: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18823 1726855047.57195: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18823 1726855047.57238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18823 1726855047.57258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18823 1726855047.57282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855047.57310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18823 1726855047.57357: variable '__network_wireless_connections_defined' from source: role '' defaults 18823 1726855047.57610: variable 'network_connections' from source: play vars 18823 1726855047.57616: variable 'profile' from source: play vars 18823 1726855047.57674: variable 'profile' from source: play vars 18823 1726855047.57685: variable 'interface' from source: set_fact 18823 1726855047.57739: variable 'interface' from source: set_fact 18823 1726855047.57893: Evaluated conditional (__network_wpa_supplicant_required): False 18823 1726855047.57897: when evaluation is False, skipping this task 18823 1726855047.57899: _execute() done 18823 1726855047.57909: dumping result 
to json 18823 1726855047.57912: done dumping result, returning 18823 1726855047.57914: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-d391-077c-000000000069] 18823 1726855047.57915: sending task result for task 0affcc66-ac2b-d391-077c-000000000069 18823 1726855047.57979: done sending task result for task 0affcc66-ac2b-d391-077c-000000000069 18823 1726855047.57983: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18823 1726855047.58027: no more pending results, returning what we have 18823 1726855047.58030: results queue empty 18823 1726855047.58031: checking for any_errors_fatal 18823 1726855047.58046: done checking for any_errors_fatal 18823 1726855047.58047: checking for max_fail_percentage 18823 1726855047.58048: done checking for max_fail_percentage 18823 1726855047.58049: checking to see if all hosts have failed and the running result is not ok 18823 1726855047.58050: done checking to see if all hosts have failed 18823 1726855047.58050: getting the remaining hosts for this loop 18823 1726855047.58052: done getting the remaining hosts for this loop 18823 1726855047.58055: getting the next task for host managed_node2 18823 1726855047.58059: done getting next task for host managed_node2 18823 1726855047.58063: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18823 1726855047.58064: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855047.58076: getting variables 18823 1726855047.58077: in VariableManager get_vars() 18823 1726855047.58115: Calling all_inventory to load vars for managed_node2 18823 1726855047.58118: Calling groups_inventory to load vars for managed_node2 18823 1726855047.58120: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855047.58129: Calling all_plugins_play to load vars for managed_node2 18823 1726855047.58131: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855047.58134: Calling groups_plugins_play to load vars for managed_node2 18823 1726855047.60612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855047.62493: done with get_vars() 18823 1726855047.62527: done getting variables 18823 1726855047.62598: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:57:27 -0400 (0:00:00.103) 0:00:39.278 ****** 18823 1726855047.62638: entering _queue_task() for managed_node2/service 18823 1726855047.63041: worker is 1 (out of 1 available) 18823 1726855047.63168: exiting _queue_task() for managed_node2/service 18823 1726855047.63180: done queuing things up, now waiting for results queue to drain 18823 1726855047.63181: waiting for pending results... 
18823 1726855047.63510: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 18823 1726855047.63514: in run() - task 0affcc66-ac2b-d391-077c-00000000006a 18823 1726855047.63517: variable 'ansible_search_path' from source: unknown 18823 1726855047.63520: variable 'ansible_search_path' from source: unknown 18823 1726855047.63537: calling self._execute() 18823 1726855047.63640: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855047.63651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855047.63666: variable 'omit' from source: magic vars 18823 1726855047.64152: variable 'ansible_distribution_major_version' from source: facts 18823 1726855047.64156: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855047.64210: variable 'network_provider' from source: set_fact 18823 1726855047.64222: Evaluated conditional (network_provider == "initscripts"): False 18823 1726855047.64228: when evaluation is False, skipping this task 18823 1726855047.64235: _execute() done 18823 1726855047.64242: dumping result to json 18823 1726855047.64255: done dumping result, returning 18823 1726855047.64270: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-d391-077c-00000000006a] 18823 1726855047.64279: sending task result for task 0affcc66-ac2b-d391-077c-00000000006a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18823 1726855047.64535: no more pending results, returning what we have 18823 1726855047.64540: results queue empty 18823 1726855047.64541: checking for any_errors_fatal 18823 1726855047.64549: done checking for any_errors_fatal 18823 1726855047.64550: checking for max_fail_percentage 18823 1726855047.64552: done checking for max_fail_percentage 18823 
1726855047.64552: checking to see if all hosts have failed and the running result is not ok 18823 1726855047.64553: done checking to see if all hosts have failed 18823 1726855047.64554: getting the remaining hosts for this loop 18823 1726855047.64555: done getting the remaining hosts for this loop 18823 1726855047.64559: getting the next task for host managed_node2 18823 1726855047.64566: done getting next task for host managed_node2 18823 1726855047.64570: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18823 1726855047.64573: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855047.64599: getting variables 18823 1726855047.64601: in VariableManager get_vars() 18823 1726855047.64640: Calling all_inventory to load vars for managed_node2 18823 1726855047.64643: Calling groups_inventory to load vars for managed_node2 18823 1726855047.64645: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855047.64658: Calling all_plugins_play to load vars for managed_node2 18823 1726855047.64661: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855047.64664: Calling groups_plugins_play to load vars for managed_node2 18823 1726855047.65304: done sending task result for task 0affcc66-ac2b-d391-077c-00000000006a 18823 1726855047.65308: WORKER PROCESS EXITING 18823 1726855047.66303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855047.69376: done with get_vars() 18823 1726855047.69543: done getting variables 18823 1726855047.69604: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:57:27 -0400 (0:00:00.070) 0:00:39.348 ****** 18823 1726855047.69734: entering _queue_task() for managed_node2/copy 18823 1726855047.70462: worker is 1 (out of 1 available) 18823 1726855047.70476: exiting _queue_task() for managed_node2/copy 18823 1726855047.70610: done queuing things up, now waiting for results queue to drain 18823 1726855047.70611: waiting for pending results... 18823 1726855047.70963: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18823 1726855047.71063: in run() - task 0affcc66-ac2b-d391-077c-00000000006b 18823 1726855047.71081: variable 'ansible_search_path' from source: unknown 18823 1726855047.71085: variable 'ansible_search_path' from source: unknown 18823 1726855047.71331: calling self._execute() 18823 1726855047.71514: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855047.71637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855047.71646: variable 'omit' from source: magic vars 18823 1726855047.72628: variable 'ansible_distribution_major_version' from source: facts 18823 1726855047.72644: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855047.72784: variable 'network_provider' from source: set_fact 18823 1726855047.72798: Evaluated conditional (network_provider == "initscripts"): False 18823 1726855047.72819: when evaluation is False, skipping this task 18823 1726855047.72821: _execute() done 18823 1726855047.72824: dumping result to json 
18823 1726855047.72826: done dumping result, returning 18823 1726855047.72894: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-d391-077c-00000000006b] 18823 1726855047.72897: sending task result for task 0affcc66-ac2b-d391-077c-00000000006b 18823 1726855047.73195: done sending task result for task 0affcc66-ac2b-d391-077c-00000000006b 18823 1726855047.73198: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18823 1726855047.73237: no more pending results, returning what we have 18823 1726855047.73241: results queue empty 18823 1726855047.73242: checking for any_errors_fatal 18823 1726855047.73247: done checking for any_errors_fatal 18823 1726855047.73248: checking for max_fail_percentage 18823 1726855047.73250: done checking for max_fail_percentage 18823 1726855047.73251: checking to see if all hosts have failed and the running result is not ok 18823 1726855047.73251: done checking to see if all hosts have failed 18823 1726855047.73252: getting the remaining hosts for this loop 18823 1726855047.73253: done getting the remaining hosts for this loop 18823 1726855047.73257: getting the next task for host managed_node2 18823 1726855047.73262: done getting next task for host managed_node2 18823 1726855047.73266: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18823 1726855047.73268: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855047.73282: getting variables 18823 1726855047.73284: in VariableManager get_vars() 18823 1726855047.73347: Calling all_inventory to load vars for managed_node2 18823 1726855047.73351: Calling groups_inventory to load vars for managed_node2 18823 1726855047.73353: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855047.73362: Calling all_plugins_play to load vars for managed_node2 18823 1726855047.73365: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855047.73367: Calling groups_plugins_play to load vars for managed_node2 18823 1726855047.75050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855047.76798: done with get_vars() 18823 1726855047.76832: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:57:27 -0400 (0:00:00.071) 0:00:39.420 ****** 18823 1726855047.76928: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18823 1726855047.77422: worker is 1 (out of 1 available) 18823 1726855047.77434: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18823 1726855047.77445: done queuing things up, now waiting for results queue to drain 18823 1726855047.77446: waiting for pending results... 
18823 1726855047.77650: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18823 1726855047.77757: in run() - task 0affcc66-ac2b-d391-077c-00000000006c 18823 1726855047.77771: variable 'ansible_search_path' from source: unknown 18823 1726855047.77775: variable 'ansible_search_path' from source: unknown 18823 1726855047.77820: calling self._execute() 18823 1726855047.77924: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855047.77931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855047.77947: variable 'omit' from source: magic vars 18823 1726855047.78448: variable 'ansible_distribution_major_version' from source: facts 18823 1726855047.78452: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855047.78454: variable 'omit' from source: magic vars 18823 1726855047.78456: variable 'omit' from source: magic vars 18823 1726855047.78618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18823 1726855047.81708: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18823 1726855047.81794: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18823 1726855047.81839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18823 1726855047.81884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18823 1726855047.81992: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18823 1726855047.82022: variable 'network_provider' from source: set_fact 18823 1726855047.82175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18823 1726855047.82237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18823 1726855047.82262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18823 1726855047.82317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18823 1726855047.82331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18823 1726855047.82418: variable 'omit' from source: magic vars 18823 1726855047.82552: variable 'omit' from source: magic vars 18823 1726855047.82705: variable 'network_connections' from source: play vars 18823 1726855047.82708: variable 'profile' from source: play vars 18823 1726855047.82761: variable 'profile' from source: play vars 18823 1726855047.82764: variable 'interface' from source: set_fact 18823 1726855047.82831: variable 'interface' from source: set_fact 18823 1726855047.82994: variable 'omit' from source: magic vars 18823 1726855047.83003: variable '__lsr_ansible_managed' from source: task vars 18823 1726855047.83191: variable '__lsr_ansible_managed' from source: task vars 18823 1726855047.83398: Loaded config def from plugin (lookup/template) 18823 1726855047.83402: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18823 1726855047.83518: File lookup term: get_ansible_managed.j2 18823 
1726855047.83528: variable 'ansible_search_path' from source: unknown 18823 1726855047.83532: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18823 1726855047.83535: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18823 1726855047.83538: variable 'ansible_search_path' from source: unknown 18823 1726855047.96174: variable 'ansible_managed' from source: unknown 18823 1726855047.96320: variable 'omit' from source: magic vars 18823 1726855047.96344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855047.96369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855047.96386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855047.96420: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855047.96430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855047.96452: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855047.96456: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855047.96459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855047.96561: Set connection var ansible_timeout to 10 18823 1726855047.96567: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855047.96569: Set connection var ansible_shell_type to sh 18823 1726855047.96575: Set connection var ansible_shell_executable to /bin/sh 18823 1726855047.96674: Set connection var ansible_connection to ssh 18823 1726855047.96677: Set connection var ansible_pipelining to False 18823 1726855047.96679: variable 'ansible_shell_executable' from source: unknown 18823 1726855047.96681: variable 'ansible_connection' from source: unknown 18823 1726855047.96683: variable 'ansible_module_compression' from source: unknown 18823 1726855047.96685: variable 'ansible_shell_type' from source: unknown 18823 1726855047.96689: variable 'ansible_shell_executable' from source: unknown 18823 1726855047.96691: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855047.96693: variable 'ansible_pipelining' from source: unknown 18823 1726855047.96695: variable 'ansible_timeout' from source: unknown 18823 1726855047.96697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855047.96781: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855047.96796: variable 'omit' from source: magic vars 18823 1726855047.96799: starting attempt loop 18823 1726855047.96801: running the handler 18823 1726855047.96812: _low_level_execute_command(): starting 18823 1726855047.96817: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855047.97604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855047.97642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855047.97656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855047.97674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855047.97783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855047.99490: stdout chunk (state=3): >>>/root <<< 18823 1726855047.99708: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 18823 1726855047.99711: stdout chunk (state=3): >>><<< 18823 1726855047.99714: stderr chunk (state=3): >>><<< 18823 1726855047.99716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855047.99718: _low_level_execute_command(): starting 18823 1726855047.99721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678 `" && echo ansible-tmp-1726855047.9966218-20672-93039559883678="` echo /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678 `" ) && sleep 0' 18823 1726855048.00277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855048.00281: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 18823 1726855048.00347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.00351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855048.00354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855048.00356: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855048.00358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.00360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855048.00363: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855048.00365: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855048.00374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855048.00380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.00400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855048.00411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855048.00418: stderr chunk (state=3): >>>debug2: match found <<< 18823 1726855048.00428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.00502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855048.00506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.00537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.00629: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.02536: stdout chunk (state=3): >>>ansible-tmp-1726855047.9966218-20672-93039559883678=/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678 <<< 18823 1726855048.02643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.02667: stderr chunk (state=3): >>><<< 18823 1726855048.02670: stdout chunk (state=3): >>><<< 18823 1726855048.02686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855047.9966218-20672-93039559883678=/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855048.02730: variable 'ansible_module_compression' from source: unknown 18823 1726855048.02761: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18823 1726855048.02800: variable 'ansible_facts' from source: unknown 18823 1726855048.02892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py 18823 1726855048.02991: Sending initial data 18823 1726855048.02994: Sent initial data (167 bytes) 18823 1726855048.03474: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.03478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.03552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.05595: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855048.05669: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855048.05718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpk0wpusit /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py <<< 18823 1726855048.05722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py" <<< 18823 1726855048.05782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpk0wpusit" to remote "/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py" <<< 18823 1726855048.07106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.07109: stdout chunk (state=3): >>><<< 18823 1726855048.07111: stderr chunk (state=3): >>><<< 18823 1726855048.07159: done transferring module to remote 18823 1726855048.07176: _low_level_execute_command(): starting 18823 1726855048.07269: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/ /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py && sleep 0' 18823 1726855048.07904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.07967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855048.07986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.08013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.08116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.09890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.09914: stderr chunk (state=3): >>><<< 18823 1726855048.09917: stdout chunk (state=3): >>><<< 18823 1726855048.09931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855048.09934: _low_level_execute_command(): starting 18823 1726855048.09937: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/AnsiballZ_network_connections.py && sleep 0' 18823 1726855048.10352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.10355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.10358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855048.10360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855048.10362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.10410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.10414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.10502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.37049: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lqmws4qo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lqmws4qo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/e1c880fd-fdb5-4526-932f-74f695fa6757: error=unknown <<< 18823 1726855048.37219: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# 
Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18823 1726855048.39144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855048.39148: stdout chunk (state=3): >>><<< 18823 1726855048.39151: stderr chunk (state=3): >>><<< 18823 1726855048.39295: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lqmws4qo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lqmws4qo/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/e1c880fd-fdb5-4526-932f-74f695fa6757: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": 
false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855048.39298: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855048.39300: _low_level_execute_command(): starting 18823 1726855048.39303: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855047.9966218-20672-93039559883678/ > /dev/null 2>&1 && sleep 0' 18823 1726855048.39929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855048.39945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855048.39973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.40078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.40101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855048.40125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.40149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.40257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.42396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.42400: stdout chunk (state=3): >>><<< 18823 1726855048.42402: stderr chunk (state=3): >>><<< 18823 1726855048.42407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855048.42415: handler run complete 18823 1726855048.42417: attempt loop complete, returning result 18823 1726855048.42419: _execute() done 18823 1726855048.42421: dumping result to json 18823 1726855048.42423: done dumping result, returning 18823 1726855048.42425: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-d391-077c-00000000006c] 18823 1726855048.42427: sending task result for task 0affcc66-ac2b-d391-077c-00000000006c 18823 1726855048.42507: done sending task result for task 0affcc66-ac2b-d391-077c-00000000006c 18823 1726855048.42511: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18823 1726855048.42615: no more pending results, returning what we have 18823 1726855048.42622: results queue empty 18823 1726855048.42623: checking for any_errors_fatal 18823 1726855048.42630: done checking for any_errors_fatal 18823 1726855048.42631: checking for max_fail_percentage 18823 1726855048.42632: done checking for max_fail_percentage 18823 1726855048.42633: checking to see if all hosts have failed and the running result is not ok 18823 1726855048.42634: done checking to see if all hosts have failed 18823 1726855048.42634: getting the remaining hosts for this loop 18823 1726855048.42636: done getting the remaining hosts for this loop 18823 1726855048.42640: getting the next task for host managed_node2 18823 1726855048.42647: done 
getting next task for host managed_node2 18823 1726855048.42651: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18823 1726855048.42653: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855048.42663: getting variables 18823 1726855048.42665: in VariableManager get_vars() 18823 1726855048.42818: Calling all_inventory to load vars for managed_node2 18823 1726855048.42821: Calling groups_inventory to load vars for managed_node2 18823 1726855048.42823: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855048.42833: Calling all_plugins_play to load vars for managed_node2 18823 1726855048.42836: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855048.42838: Calling groups_plugins_play to load vars for managed_node2 18823 1726855048.44666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855048.46515: done with get_vars() 18823 1726855048.46541: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:57:28 -0400 (0:00:00.696) 0:00:40.117 ****** 18823 1726855048.46627: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18823 1726855048.46960: worker is 1 (out of 1 available) 18823 1726855048.46972: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18823 1726855048.46983: done queuing things up, now waiting for results queue to drain 18823 1726855048.46983: waiting for pending results... 
18823 1726855048.47462: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 18823 1726855048.47467: in run() - task 0affcc66-ac2b-d391-077c-00000000006d 18823 1726855048.47470: variable 'ansible_search_path' from source: unknown 18823 1726855048.47473: variable 'ansible_search_path' from source: unknown 18823 1726855048.47475: calling self._execute() 18823 1726855048.47485: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.47492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.47507: variable 'omit' from source: magic vars 18823 1726855048.47870: variable 'ansible_distribution_major_version' from source: facts 18823 1726855048.47883: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855048.48006: variable 'network_state' from source: role '' defaults 18823 1726855048.48012: Evaluated conditional (network_state != {}): False 18823 1726855048.48015: when evaluation is False, skipping this task 18823 1726855048.48017: _execute() done 18823 1726855048.48020: dumping result to json 18823 1726855048.48025: done dumping result, returning 18823 1726855048.48032: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-d391-077c-00000000006d] 18823 1726855048.48037: sending task result for task 0affcc66-ac2b-d391-077c-00000000006d 18823 1726855048.48233: done sending task result for task 0affcc66-ac2b-d391-077c-00000000006d 18823 1726855048.48236: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18823 1726855048.48285: no more pending results, returning what we have 18823 1726855048.48290: results queue empty 18823 1726855048.48292: checking for any_errors_fatal 18823 1726855048.48301: done checking for any_errors_fatal 
18823 1726855048.48302: checking for max_fail_percentage 18823 1726855048.48304: done checking for max_fail_percentage 18823 1726855048.48304: checking to see if all hosts have failed and the running result is not ok 18823 1726855048.48305: done checking to see if all hosts have failed 18823 1726855048.48306: getting the remaining hosts for this loop 18823 1726855048.48307: done getting the remaining hosts for this loop 18823 1726855048.48311: getting the next task for host managed_node2 18823 1726855048.48317: done getting next task for host managed_node2 18823 1726855048.48321: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18823 1726855048.48323: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855048.48337: getting variables 18823 1726855048.48339: in VariableManager get_vars() 18823 1726855048.48375: Calling all_inventory to load vars for managed_node2 18823 1726855048.48378: Calling groups_inventory to load vars for managed_node2 18823 1726855048.48380: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855048.48494: Calling all_plugins_play to load vars for managed_node2 18823 1726855048.48498: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855048.48502: Calling groups_plugins_play to load vars for managed_node2 18823 1726855048.54175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855048.55744: done with get_vars() 18823 1726855048.55767: done getting variables 18823 1726855048.55816: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:57:28 -0400 (0:00:00.092) 0:00:40.210 ****** 18823 1726855048.55840: entering _queue_task() for managed_node2/debug 18823 1726855048.56167: worker is 1 (out of 1 available) 18823 1726855048.56179: exiting _queue_task() for managed_node2/debug 18823 1726855048.56192: done queuing things up, now waiting for results queue to drain 18823 1726855048.56193: waiting for pending results... 
18823 1726855048.56546: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18823 1726855048.56572: in run() - task 0affcc66-ac2b-d391-077c-00000000006e 18823 1726855048.56614: variable 'ansible_search_path' from source: unknown 18823 1726855048.56619: variable 'ansible_search_path' from source: unknown 18823 1726855048.56641: calling self._execute() 18823 1726855048.56732: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.56736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.56747: variable 'omit' from source: magic vars 18823 1726855048.57128: variable 'ansible_distribution_major_version' from source: facts 18823 1726855048.57138: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855048.57149: variable 'omit' from source: magic vars 18823 1726855048.57190: variable 'omit' from source: magic vars 18823 1726855048.57227: variable 'omit' from source: magic vars 18823 1726855048.57271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855048.57307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855048.57325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855048.57340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855048.57350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855048.57381: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855048.57385: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.57390: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855048.57479: Set connection var ansible_timeout to 10 18823 1726855048.57515: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855048.57518: Set connection var ansible_shell_type to sh 18823 1726855048.57521: Set connection var ansible_shell_executable to /bin/sh 18823 1726855048.57523: Set connection var ansible_connection to ssh 18823 1726855048.57526: Set connection var ansible_pipelining to False 18823 1726855048.57528: variable 'ansible_shell_executable' from source: unknown 18823 1726855048.57530: variable 'ansible_connection' from source: unknown 18823 1726855048.57533: variable 'ansible_module_compression' from source: unknown 18823 1726855048.57535: variable 'ansible_shell_type' from source: unknown 18823 1726855048.57538: variable 'ansible_shell_executable' from source: unknown 18823 1726855048.57540: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.57593: variable 'ansible_pipelining' from source: unknown 18823 1726855048.57597: variable 'ansible_timeout' from source: unknown 18823 1726855048.57599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.57765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855048.57769: variable 'omit' from source: magic vars 18823 1726855048.57772: starting attempt loop 18823 1726855048.57775: running the handler 18823 1726855048.57846: variable '__network_connections_result' from source: set_fact 18823 1726855048.57900: handler run complete 18823 1726855048.57953: attempt loop complete, returning result 18823 1726855048.57956: _execute() done 18823 1726855048.57959: dumping result to json 18823 1726855048.57961: 
done dumping result, returning 18823 1726855048.57964: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-d391-077c-00000000006e] 18823 1726855048.57967: sending task result for task 0affcc66-ac2b-d391-077c-00000000006e 18823 1726855048.58135: done sending task result for task 0affcc66-ac2b-d391-077c-00000000006e 18823 1726855048.58164: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 18823 1726855048.58223: no more pending results, returning what we have 18823 1726855048.58227: results queue empty 18823 1726855048.58228: checking for any_errors_fatal 18823 1726855048.58235: done checking for any_errors_fatal 18823 1726855048.58236: checking for max_fail_percentage 18823 1726855048.58237: done checking for max_fail_percentage 18823 1726855048.58238: checking to see if all hosts have failed and the running result is not ok 18823 1726855048.58239: done checking to see if all hosts have failed 18823 1726855048.58239: getting the remaining hosts for this loop 18823 1726855048.58241: done getting the remaining hosts for this loop 18823 1726855048.58244: getting the next task for host managed_node2 18823 1726855048.58250: done getting next task for host managed_node2 18823 1726855048.58254: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18823 1726855048.58256: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855048.58266: getting variables 18823 1726855048.58268: in VariableManager get_vars() 18823 1726855048.58306: Calling all_inventory to load vars for managed_node2 18823 1726855048.58309: Calling groups_inventory to load vars for managed_node2 18823 1726855048.58311: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855048.58322: Calling all_plugins_play to load vars for managed_node2 18823 1726855048.58326: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855048.58329: Calling groups_plugins_play to load vars for managed_node2 18823 1726855048.60008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855048.61626: done with get_vars() 18823 1726855048.61649: done getting variables 18823 1726855048.61713: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:57:28 -0400 (0:00:00.059) 0:00:40.269 ****** 18823 1726855048.61744: entering _queue_task() for managed_node2/debug 18823 1726855048.62214: worker is 1 (out of 1 available) 18823 1726855048.62224: exiting _queue_task() for managed_node2/debug 18823 1726855048.62233: done queuing things up, now waiting for results queue to drain 18823 1726855048.62234: waiting for pending results... 
18823 1726855048.62621: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18823 1726855048.62724: in run() - task 0affcc66-ac2b-d391-077c-00000000006f 18823 1726855048.62737: variable 'ansible_search_path' from source: unknown 18823 1726855048.62740: variable 'ansible_search_path' from source: unknown 18823 1726855048.62775: calling self._execute() 18823 1726855048.62873: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.62881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.62921: variable 'omit' from source: magic vars 18823 1726855048.63295: variable 'ansible_distribution_major_version' from source: facts 18823 1726855048.63494: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855048.63499: variable 'omit' from source: magic vars 18823 1726855048.63502: variable 'omit' from source: magic vars 18823 1726855048.63507: variable 'omit' from source: magic vars 18823 1726855048.63509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855048.63512: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855048.63515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855048.63517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855048.63593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855048.63597: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855048.63600: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.63603: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 18823 1726855048.63795: Set connection var ansible_timeout to 10 18823 1726855048.63799: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855048.63802: Set connection var ansible_shell_type to sh 18823 1726855048.63807: Set connection var ansible_shell_executable to /bin/sh 18823 1726855048.63809: Set connection var ansible_connection to ssh 18823 1726855048.63812: Set connection var ansible_pipelining to False 18823 1726855048.63814: variable 'ansible_shell_executable' from source: unknown 18823 1726855048.63816: variable 'ansible_connection' from source: unknown 18823 1726855048.63819: variable 'ansible_module_compression' from source: unknown 18823 1726855048.63822: variable 'ansible_shell_type' from source: unknown 18823 1726855048.63824: variable 'ansible_shell_executable' from source: unknown 18823 1726855048.63826: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.63828: variable 'ansible_pipelining' from source: unknown 18823 1726855048.63830: variable 'ansible_timeout' from source: unknown 18823 1726855048.63832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.63875: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855048.63885: variable 'omit' from source: magic vars 18823 1726855048.63893: starting attempt loop 18823 1726855048.63898: running the handler 18823 1726855048.64128: variable '__network_connections_result' from source: set_fact 18823 1726855048.64132: variable '__network_connections_result' from source: set_fact 18823 1726855048.64135: handler run complete 18823 1726855048.64147: attempt loop complete, returning result 18823 1726855048.64150: 
_execute() done 18823 1726855048.64153: dumping result to json 18823 1726855048.64156: done dumping result, returning 18823 1726855048.64159: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-d391-077c-00000000006f] 18823 1726855048.64162: sending task result for task 0affcc66-ac2b-d391-077c-00000000006f 18823 1726855048.64323: done sending task result for task 0affcc66-ac2b-d391-077c-00000000006f 18823 1726855048.64327: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18823 1726855048.64609: no more pending results, returning what we have 18823 1726855048.64613: results queue empty 18823 1726855048.64614: checking for any_errors_fatal 18823 1726855048.64618: done checking for any_errors_fatal 18823 1726855048.64619: checking for max_fail_percentage 18823 1726855048.64621: done checking for max_fail_percentage 18823 1726855048.64622: checking to see if all hosts have failed and the running result is not ok 18823 1726855048.64623: done checking to see if all hosts have failed 18823 1726855048.64623: getting the remaining hosts for this loop 18823 1726855048.64624: done getting the remaining hosts for this loop 18823 1726855048.64628: getting the next task for host managed_node2 18823 1726855048.64633: done getting next task for host managed_node2 18823 1726855048.64636: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18823 1726855048.64638: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855048.64652: getting variables 18823 1726855048.64654: in VariableManager get_vars() 18823 1726855048.64686: Calling all_inventory to load vars for managed_node2 18823 1726855048.64695: Calling groups_inventory to load vars for managed_node2 18823 1726855048.64697: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855048.64707: Calling all_plugins_play to load vars for managed_node2 18823 1726855048.64710: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855048.64713: Calling groups_plugins_play to load vars for managed_node2 18823 1726855048.67303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855048.69297: done with get_vars() 18823 1726855048.69328: done getting variables 18823 1726855048.69407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:57:28 -0400 (0:00:00.076) 0:00:40.346 ****** 18823 1726855048.69444: entering _queue_task() for managed_node2/debug 18823 1726855048.69803: worker is 1 (out of 1 available) 18823 1726855048.69815: exiting _queue_task() for managed_node2/debug 18823 1726855048.69828: done queuing things up, now waiting for results queue to drain 18823 1726855048.69829: waiting for pending results... 
18823 1726855048.70176: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18823 1726855048.70231: in run() - task 0affcc66-ac2b-d391-077c-000000000070 18823 1726855048.70270: variable 'ansible_search_path' from source: unknown 18823 1726855048.70278: variable 'ansible_search_path' from source: unknown 18823 1726855048.70378: calling self._execute() 18823 1726855048.70384: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.70388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.70394: variable 'omit' from source: magic vars 18823 1726855048.70779: variable 'ansible_distribution_major_version' from source: facts 18823 1726855048.70791: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855048.70914: variable 'network_state' from source: role '' defaults 18823 1726855048.70924: Evaluated conditional (network_state != {}): False 18823 1726855048.70927: when evaluation is False, skipping this task 18823 1726855048.70932: _execute() done 18823 1726855048.70935: dumping result to json 18823 1726855048.70937: done dumping result, returning 18823 1726855048.70943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-d391-077c-000000000070] 18823 1726855048.70946: sending task result for task 0affcc66-ac2b-d391-077c-000000000070 18823 1726855048.71112: done sending task result for task 0affcc66-ac2b-d391-077c-000000000070 18823 1726855048.71116: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 18823 1726855048.71168: no more pending results, returning what we have 18823 1726855048.71172: results queue empty 18823 1726855048.71173: checking for any_errors_fatal 18823 1726855048.71181: done checking for any_errors_fatal 18823 1726855048.71182: checking for 
max_fail_percentage 18823 1726855048.71184: done checking for max_fail_percentage 18823 1726855048.71185: checking to see if all hosts have failed and the running result is not ok 18823 1726855048.71186: done checking to see if all hosts have failed 18823 1726855048.71186: getting the remaining hosts for this loop 18823 1726855048.71190: done getting the remaining hosts for this loop 18823 1726855048.71194: getting the next task for host managed_node2 18823 1726855048.71202: done getting next task for host managed_node2 18823 1726855048.71206: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18823 1726855048.71208: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855048.71223: getting variables 18823 1726855048.71225: in VariableManager get_vars() 18823 1726855048.71263: Calling all_inventory to load vars for managed_node2 18823 1726855048.71267: Calling groups_inventory to load vars for managed_node2 18823 1726855048.71270: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855048.71286: Calling all_plugins_play to load vars for managed_node2 18823 1726855048.71343: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855048.71347: Calling groups_plugins_play to load vars for managed_node2 18823 1726855048.73428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855048.74777: done with get_vars() 18823 1726855048.74795: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:57:28 -0400 
(0:00:00.054) 0:00:40.400 ****** 18823 1726855048.74863: entering _queue_task() for managed_node2/ping 18823 1726855048.75107: worker is 1 (out of 1 available) 18823 1726855048.75121: exiting _queue_task() for managed_node2/ping 18823 1726855048.75135: done queuing things up, now waiting for results queue to drain 18823 1726855048.75136: waiting for pending results... 18823 1726855048.75318: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 18823 1726855048.75393: in run() - task 0affcc66-ac2b-d391-077c-000000000071 18823 1726855048.75405: variable 'ansible_search_path' from source: unknown 18823 1726855048.75410: variable 'ansible_search_path' from source: unknown 18823 1726855048.75450: calling self._execute() 18823 1726855048.75793: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.75797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.75800: variable 'omit' from source: magic vars 18823 1726855048.75955: variable 'ansible_distribution_major_version' from source: facts 18823 1726855048.75958: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855048.75961: variable 'omit' from source: magic vars 18823 1726855048.75995: variable 'omit' from source: magic vars 18823 1726855048.76032: variable 'omit' from source: magic vars 18823 1726855048.76210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855048.76264: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855048.76296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855048.76321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855048.76338: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855048.76391: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855048.76583: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.76688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.76895: Set connection var ansible_timeout to 10 18823 1726855048.76898: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855048.76901: Set connection var ansible_shell_type to sh 18823 1726855048.76903: Set connection var ansible_shell_executable to /bin/sh 18823 1726855048.76904: Set connection var ansible_connection to ssh 18823 1726855048.76906: Set connection var ansible_pipelining to False 18823 1726855048.76910: variable 'ansible_shell_executable' from source: unknown 18823 1726855048.76913: variable 'ansible_connection' from source: unknown 18823 1726855048.76915: variable 'ansible_module_compression' from source: unknown 18823 1726855048.76917: variable 'ansible_shell_type' from source: unknown 18823 1726855048.76919: variable 'ansible_shell_executable' from source: unknown 18823 1726855048.76920: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855048.76922: variable 'ansible_pipelining' from source: unknown 18823 1726855048.76924: variable 'ansible_timeout' from source: unknown 18823 1726855048.76926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855048.77263: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855048.77329: variable 'omit' from source: magic vars 18823 1726855048.77332: starting attempt loop 18823 1726855048.77335: running 
the handler 18823 1726855048.77337: _low_level_execute_command(): starting 18823 1726855048.77340: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855048.77836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.77840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.77843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855048.77846: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.77893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.77898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.77976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.79779: stdout chunk (state=3): >>>/root <<< 18823 1726855048.79870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.79874: stdout chunk (state=3): >>><<< 18823 1726855048.79949: stderr chunk (state=3): >>><<< 18823 1726855048.79954: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855048.80062: _low_level_execute_command(): starting 18823 1726855048.80070: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900 `" && echo ansible-tmp-1726855048.7991247-20719-159306280598900="` echo /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900 `" ) && sleep 0' 18823 1726855048.81014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855048.81032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855048.81037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 
1726855048.81046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855048.81064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855048.81076: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855048.81078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.81141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855048.81144: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855048.81146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855048.81197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.81202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855048.81212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.81251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.81324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.83241: stdout chunk (state=3): >>>ansible-tmp-1726855048.7991247-20719-159306280598900=/root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900 <<< 18823 1726855048.83406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.83409: stdout chunk (state=3): >>><<< 18823 1726855048.83412: stderr 
chunk (state=3): >>><<< 18823 1726855048.83723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855048.7991247-20719-159306280598900=/root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855048.83727: variable 'ansible_module_compression' from source: unknown 18823 1726855048.83730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18823 1726855048.83732: variable 'ansible_facts' from source: unknown 18823 1726855048.83777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py 18823 1726855048.84057: Sending initial data 18823 1726855048.84136: Sent initial data (153 bytes) 18823 1726855048.84665: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855048.84676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855048.84711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855048.84722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.84771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.84801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.84817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.84919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.86491: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855048.86551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855048.86621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpc31sivs9 /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py <<< 18823 1726855048.86623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py" <<< 18823 1726855048.86695: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpc31sivs9" to remote "/root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py" <<< 18823 1726855048.86703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py" <<< 18823 1726855048.87333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.87367: stderr chunk (state=3): >>><<< 18823 1726855048.87370: stdout chunk (state=3): >>><<< 18823 1726855048.87396: done transferring module to remote 18823 1726855048.87408: _low_level_execute_command(): starting 18823 1726855048.87411: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/ /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py && sleep 0' 18823 1726855048.87839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config <<< 18823 1726855048.87842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855048.87845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.87850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855048.87852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855048.87854: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.87893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855048.87904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.87916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.87997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855048.89740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855048.89763: stderr chunk (state=3): >>><<< 18823 1726855048.89767: stdout chunk (state=3): >>><<< 18823 1726855048.89778: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855048.89782: _low_level_execute_command(): starting 18823 1726855048.89786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/AnsiballZ_ping.py && sleep 0' 18823 1726855048.90217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.90221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855048.90223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 
1726855048.90225: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855048.90227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855048.90277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855048.90280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855048.90284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855048.90359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855049.05436: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18823 1726855049.06613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855049.06628: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855049.06685: stderr chunk (state=3): >>><<< 18823 1726855049.07111: stdout chunk (state=3): >>><<< 18823 1726855049.07115: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855049.07118: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855049.07122: _low_level_execute_command(): starting 18823 1726855049.07124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855048.7991247-20719-159306280598900/ > /dev/null 2>&1 && sleep 0' 18823 1726855049.08344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855049.08485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
18823 1726855049.08576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855049.08662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855049.08712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855049.08860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855049.10997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855049.11002: stdout chunk (state=3): >>><<< 18823 1726855049.11008: stderr chunk (state=3): >>><<< 18823 1726855049.11010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855049.11012: handler run complete 18823 1726855049.11014: attempt loop complete, returning 
result 18823 1726855049.11016: _execute() done 18823 1726855049.11018: dumping result to json 18823 1726855049.11019: done dumping result, returning 18823 1726855049.11021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-d391-077c-000000000071] 18823 1726855049.11023: sending task result for task 0affcc66-ac2b-d391-077c-000000000071 ok: [managed_node2] => { "changed": false, "ping": "pong" } 18823 1726855049.11325: no more pending results, returning what we have 18823 1726855049.11329: results queue empty 18823 1726855049.11330: checking for any_errors_fatal 18823 1726855049.11339: done checking for any_errors_fatal 18823 1726855049.11340: checking for max_fail_percentage 18823 1726855049.11342: done checking for max_fail_percentage 18823 1726855049.11342: checking to see if all hosts have failed and the running result is not ok 18823 1726855049.11343: done checking to see if all hosts have failed 18823 1726855049.11344: getting the remaining hosts for this loop 18823 1726855049.11345: done getting the remaining hosts for this loop 18823 1726855049.11350: getting the next task for host managed_node2 18823 1726855049.11362: done getting next task for host managed_node2 18823 1726855049.11364: ^ task is: TASK: meta (role_complete) 18823 1726855049.11366: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855049.11377: getting variables 18823 1726855049.11379: in VariableManager get_vars() 18823 1726855049.11924: Calling all_inventory to load vars for managed_node2 18823 1726855049.11928: Calling groups_inventory to load vars for managed_node2 18823 1726855049.11931: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855049.11942: Calling all_plugins_play to load vars for managed_node2 18823 1726855049.11945: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855049.11949: Calling groups_plugins_play to load vars for managed_node2 18823 1726855049.12510: done sending task result for task 0affcc66-ac2b-d391-077c-000000000071 18823 1726855049.12514: WORKER PROCESS EXITING 18823 1726855049.14620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855049.16918: done with get_vars() 18823 1726855049.16947: done getting variables 18823 1726855049.17153: done queuing things up, now waiting for results queue to drain 18823 1726855049.17156: results queue empty 18823 1726855049.17157: checking for any_errors_fatal 18823 1726855049.17160: done checking for any_errors_fatal 18823 1726855049.17161: checking for max_fail_percentage 18823 1726855049.17162: done checking for max_fail_percentage 18823 1726855049.17163: checking to see if all hosts have failed and the running result is not ok 18823 1726855049.17164: done checking to see if all hosts have failed 18823 1726855049.17164: getting the remaining hosts for this loop 18823 1726855049.17165: done getting the remaining hosts for this loop 18823 1726855049.17168: getting the next task for host managed_node2 18823 1726855049.17172: done getting next task for host managed_node2 18823 1726855049.17174: ^ task is: TASK: meta (flush_handlers) 18823 1726855049.17175: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855049.17178: getting variables 18823 1726855049.17179: in VariableManager get_vars() 18823 1726855049.17301: Calling all_inventory to load vars for managed_node2 18823 1726855049.17307: Calling groups_inventory to load vars for managed_node2 18823 1726855049.17310: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855049.17315: Calling all_plugins_play to load vars for managed_node2 18823 1726855049.17318: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855049.17320: Calling groups_plugins_play to load vars for managed_node2 18823 1726855049.19238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855049.22222: done with get_vars() 18823 1726855049.22248: done getting variables 18823 1726855049.22418: in VariableManager get_vars() 18823 1726855049.22431: Calling all_inventory to load vars for managed_node2 18823 1726855049.22433: Calling groups_inventory to load vars for managed_node2 18823 1726855049.22435: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855049.22439: Calling all_plugins_play to load vars for managed_node2 18823 1726855049.22441: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855049.22443: Calling groups_plugins_play to load vars for managed_node2 18823 1726855049.24759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855049.28308: done with get_vars() 18823 1726855049.28351: done queuing things up, now waiting for results queue to drain 18823 1726855049.28354: results queue empty 18823 1726855049.28355: checking for any_errors_fatal 18823 1726855049.28356: done checking for any_errors_fatal 18823 1726855049.28357: checking for 
max_fail_percentage 18823 1726855049.28358: done checking for max_fail_percentage 18823 1726855049.28359: checking to see if all hosts have failed and the running result is not ok 18823 1726855049.28360: done checking to see if all hosts have failed 18823 1726855049.28361: getting the remaining hosts for this loop 18823 1726855049.28362: done getting the remaining hosts for this loop 18823 1726855049.28480: getting the next task for host managed_node2 18823 1726855049.28489: done getting next task for host managed_node2 18823 1726855049.28496: ^ task is: TASK: meta (flush_handlers) 18823 1726855049.28497: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855049.28501: getting variables 18823 1726855049.28502: in VariableManager get_vars() 18823 1726855049.28521: Calling all_inventory to load vars for managed_node2 18823 1726855049.28524: Calling groups_inventory to load vars for managed_node2 18823 1726855049.28526: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855049.28532: Calling all_plugins_play to load vars for managed_node2 18823 1726855049.28534: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855049.28537: Calling groups_plugins_play to load vars for managed_node2 18823 1726855049.31329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855049.34837: done with get_vars() 18823 1726855049.34867: done getting variables 18823 1726855049.35008: in VariableManager get_vars() 18823 1726855049.35029: Calling all_inventory to load vars for managed_node2 18823 1726855049.35032: Calling groups_inventory to load vars for managed_node2 18823 1726855049.35034: Calling all_plugins_inventory to load vars 
for managed_node2 18823 1726855049.35039: Calling all_plugins_play to load vars for managed_node2 18823 1726855049.35041: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855049.35044: Calling groups_plugins_play to load vars for managed_node2 18823 1726855049.37616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855049.40962: done with get_vars() 18823 1726855049.41069: done queuing things up, now waiting for results queue to drain 18823 1726855049.41072: results queue empty 18823 1726855049.41073: checking for any_errors_fatal 18823 1726855049.41075: done checking for any_errors_fatal 18823 1726855049.41075: checking for max_fail_percentage 18823 1726855049.41076: done checking for max_fail_percentage 18823 1726855049.41077: checking to see if all hosts have failed and the running result is not ok 18823 1726855049.41078: done checking to see if all hosts have failed 18823 1726855049.41078: getting the remaining hosts for this loop 18823 1726855049.41079: done getting the remaining hosts for this loop 18823 1726855049.41137: getting the next task for host managed_node2 18823 1726855049.41141: done getting next task for host managed_node2 18823 1726855049.41142: ^ task is: None 18823 1726855049.41143: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855049.41144: done queuing things up, now waiting for results queue to drain 18823 1726855049.41145: results queue empty 18823 1726855049.41146: checking for any_errors_fatal 18823 1726855049.41147: done checking for any_errors_fatal 18823 1726855049.41147: checking for max_fail_percentage 18823 1726855049.41148: done checking for max_fail_percentage 18823 1726855049.41149: checking to see if all hosts have failed and the running result is not ok 18823 1726855049.41149: done checking to see if all hosts have failed 18823 1726855049.41151: getting the next task for host managed_node2 18823 1726855049.41153: done getting next task for host managed_node2 18823 1726855049.41153: ^ task is: None 18823 1726855049.41154: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855049.41211: in VariableManager get_vars() 18823 1726855049.41228: done with get_vars() 18823 1726855049.41235: in VariableManager get_vars() 18823 1726855049.41244: done with get_vars() 18823 1726855049.41248: variable 'omit' from source: magic vars 18823 1726855049.41279: in VariableManager get_vars() 18823 1726855049.41292: done with get_vars() 18823 1726855049.41317: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 18823 1726855049.41575: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855049.41601: getting the remaining hosts for this loop 18823 1726855049.41602: done getting the remaining hosts for this loop 18823 1726855049.41607: getting the next task for host managed_node2 18823 1726855049.41610: done getting next task for host managed_node2 18823 1726855049.41612: ^ task is: TASK: Gathering Facts 18823 1726855049.41614: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855049.41616: getting variables 18823 1726855049.41617: in VariableManager get_vars() 18823 1726855049.41630: Calling all_inventory to load vars for managed_node2 18823 1726855049.41633: Calling groups_inventory to load vars for managed_node2 18823 1726855049.41635: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855049.41641: Calling all_plugins_play to load vars for managed_node2 18823 1726855049.41643: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855049.41646: Calling groups_plugins_play to load vars for managed_node2 18823 1726855049.43270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855049.44998: done with get_vars() 18823 1726855049.45029: done getting variables 18823 1726855049.45077: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 13:57:29 -0400 (0:00:00.702) 0:00:41.102 ****** 18823 1726855049.45114: entering _queue_task() for managed_node2/gather_facts 18823 1726855049.45491: worker is 1 (out of 1 available) 18823 1726855049.45502: exiting _queue_task() for managed_node2/gather_facts 18823 1726855049.45515: done queuing things up, now waiting for results queue to drain 18823 1726855049.45517: waiting for pending results... 
18823 1726855049.45817: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855049.45919: in run() - task 0affcc66-ac2b-d391-077c-0000000004e4 18823 1726855049.45936: variable 'ansible_search_path' from source: unknown 18823 1726855049.45977: calling self._execute() 18823 1726855049.46078: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855049.46090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855049.46107: variable 'omit' from source: magic vars 18823 1726855049.46567: variable 'ansible_distribution_major_version' from source: facts 18823 1726855049.46572: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855049.46574: variable 'omit' from source: magic vars 18823 1726855049.46576: variable 'omit' from source: magic vars 18823 1726855049.46617: variable 'omit' from source: magic vars 18823 1726855049.46677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855049.46724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855049.46750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855049.46772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855049.46799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855049.46895: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855049.46898: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855049.46900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855049.46946: Set connection var ansible_timeout to 10 18823 1726855049.46956: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855049.46961: Set connection var ansible_shell_type to sh 18823 1726855049.46968: Set connection var ansible_shell_executable to /bin/sh 18823 1726855049.46975: Set connection var ansible_connection to ssh 18823 1726855049.46982: Set connection var ansible_pipelining to False 18823 1726855049.47021: variable 'ansible_shell_executable' from source: unknown 18823 1726855049.47029: variable 'ansible_connection' from source: unknown 18823 1726855049.47035: variable 'ansible_module_compression' from source: unknown 18823 1726855049.47042: variable 'ansible_shell_type' from source: unknown 18823 1726855049.47048: variable 'ansible_shell_executable' from source: unknown 18823 1726855049.47055: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855049.47114: variable 'ansible_pipelining' from source: unknown 18823 1726855049.47117: variable 'ansible_timeout' from source: unknown 18823 1726855049.47119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855049.47262: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855049.47276: variable 'omit' from source: magic vars 18823 1726855049.47284: starting attempt loop 18823 1726855049.47292: running the handler 18823 1726855049.47314: variable 'ansible_facts' from source: unknown 18823 1726855049.47343: _low_level_execute_command(): starting 18823 1726855049.47355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855049.48217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855049.48244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855049.48269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855049.48283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855049.48396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855049.50155: stdout chunk (state=3): >>>/root <<< 18823 1726855049.50297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855049.50315: stderr chunk (state=3): >>><<< 18823 1726855049.50325: stdout chunk (state=3): >>><<< 18823 1726855049.50362: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855049.50406: _low_level_execute_command(): starting 18823 1726855049.50411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619 `" && echo ansible-tmp-1726855049.5036988-20756-131840937152619="` echo /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619 `" ) && sleep 0' 18823 1726855049.51101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855049.51104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855049.51108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855049.51111: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855049.51121: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855049.51179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855049.51183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855049.51185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855049.51276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855049.53173: stdout chunk (state=3): >>>ansible-tmp-1726855049.5036988-20756-131840937152619=/root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619 <<< 18823 1726855049.53327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855049.53331: stderr chunk (state=3): >>><<< 18823 1726855049.53333: stdout chunk (state=3): >>><<< 18823 1726855049.53342: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855049.5036988-20756-131840937152619=/root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855049.53494: variable 'ansible_module_compression' from source: unknown 18823 1726855049.53497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855049.53511: variable 'ansible_facts' from source: unknown 18823 1726855049.53726: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py 18823 1726855049.53916: Sending initial data 18823 1726855049.53919: Sent initial data (154 bytes) 18823 1726855049.54699: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855049.54801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855049.54818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855049.54831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855049.54930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855049.56459: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18823 1726855049.56472: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855049.56536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855049.56596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpve1asjul /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py <<< 18823 1726855049.56616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py" <<< 18823 1726855049.56682: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpve1asjul" to remote "/root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py" <<< 18823 1726855049.58481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855049.58485: stdout chunk (state=3): >>><<< 18823 1726855049.58496: stderr chunk (state=3): >>><<< 18823 1726855049.58502: done transferring module to remote 18823 1726855049.58504: _low_level_execute_command(): starting 18823 1726855049.58507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/ /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py && sleep 0' 18823 1726855049.59025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855049.59042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855049.59060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855049.59078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855049.59180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855049.59212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855049.59317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855049.61118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855049.61151: stderr chunk (state=3): >>><<< 18823 1726855049.61154: stdout chunk (state=3): >>><<< 18823 1726855049.61251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855049.61254: _low_level_execute_command(): starting 18823 1726855049.61257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/AnsiballZ_setup.py && sleep 0' 18823 1726855049.61804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855049.61845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855049.61862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855049.61877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855049.61981: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.26224: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 833, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794570240, "block_size": 4096, "block_total": 65519099, "block_available": 63914690, "block_used": 1604409, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH<<< 18823 1726855050.26274: stdout chunk (state=3): >>>6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": 
["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "30", "epoch": "1726855050", "epoch_int": "1726855050", "date": "2024-09-20", "time": "13:57:30", "iso8601_micro": "2024-09-20T17:57:30.222574Z", "iso8601": "2024-09-20T17:57:30Z", "iso8601_basic": "20240920T135730222574", "iso8601_basic_short": "20240920T135730", "tz": 
"EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], 
"ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.5869140625, "5m": 0.43212890625, "15m": 0.22314453125}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855050.28190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.28208: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. <<< 18823 1726855050.28235: stderr chunk (state=3): >>><<< 18823 1726855050.28239: stdout chunk (state=3): >>><<< 18823 1726855050.28494: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": 
{"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 833, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794570240, "block_size": 4096, "block_total": 65519099, "block_available": 63914690, "block_used": 1604409, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 
10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": 
{"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "30", "epoch": "1726855050", "epoch_int": "1726855050", "date": "2024-09-20", "time": "13:57:30", "iso8601_micro": "2024-09-20T17:57:30.222574Z", "iso8601": "2024-09-20T17:57:30Z", "iso8601_basic": "20240920T135730222574", "iso8601_basic_short": "20240920T135730", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.5869140625, "5m": 0.43212890625, "15m": 0.22314453125}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855050.28673: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855050.28702: _low_level_execute_command(): starting 18823 1726855050.28715: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855049.5036988-20756-131840937152619/ > /dev/null 2>&1 && sleep 0' 18823 1726855050.29326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855050.29330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855050.29332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
18823 1726855050.29334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855050.29336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855050.29386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855050.29392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.29476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.31496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.31500: stdout chunk (state=3): >>><<< 18823 1726855050.31503: stderr chunk (state=3): >>><<< 18823 1726855050.31505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855050.31508: handler run complete 18823 1726855050.31523: variable 'ansible_facts' from source: unknown 18823 1726855050.31644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.31960: variable 'ansible_facts' from source: unknown 18823 1726855050.32058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.32194: attempt loop complete, returning result 18823 1726855050.32203: _execute() done 18823 1726855050.32212: dumping result to json 18823 1726855050.32245: done dumping result, returning 18823 1726855050.32265: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-0000000004e4] 18823 1726855050.32375: sending task result for task 0affcc66-ac2b-d391-077c-0000000004e4 ok: [managed_node2] 18823 1726855050.33005: no more pending results, returning what we have 18823 1726855050.33007: results queue empty 18823 1726855050.33008: checking for any_errors_fatal 18823 1726855050.33009: done checking for any_errors_fatal 18823 1726855050.33010: checking for max_fail_percentage 18823 1726855050.33011: done checking for max_fail_percentage 18823 1726855050.33011: checking to see if all hosts have failed and the running result is not ok 18823 1726855050.33012: done checking to see if all hosts have failed 18823 1726855050.33012: getting the remaining hosts for this loop 18823 1726855050.33013: done getting the 
remaining hosts for this loop 18823 1726855050.33016: getting the next task for host managed_node2 18823 1726855050.33020: done getting next task for host managed_node2 18823 1726855050.33021: ^ task is: TASK: meta (flush_handlers) 18823 1726855050.33022: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855050.33026: getting variables 18823 1726855050.33027: in VariableManager get_vars() 18823 1726855050.33045: Calling all_inventory to load vars for managed_node2 18823 1726855050.33049: Calling groups_inventory to load vars for managed_node2 18823 1726855050.33051: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.33060: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.33062: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.33065: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.33580: done sending task result for task 0affcc66-ac2b-d391-077c-0000000004e4 18823 1726855050.33584: WORKER PROCESS EXITING 18823 1726855050.33928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.35133: done with get_vars() 18823 1726855050.35152: done getting variables 18823 1726855050.35221: in VariableManager get_vars() 18823 1726855050.35233: Calling all_inventory to load vars for managed_node2 18823 1726855050.35236: Calling groups_inventory to load vars for managed_node2 18823 1726855050.35238: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.35245: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.35247: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.35249: 
Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.36336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.37266: done with get_vars() 18823 1726855050.37291: done queuing things up, now waiting for results queue to drain 18823 1726855050.37293: results queue empty 18823 1726855050.37294: checking for any_errors_fatal 18823 1726855050.37296: done checking for any_errors_fatal 18823 1726855050.37297: checking for max_fail_percentage 18823 1726855050.37301: done checking for max_fail_percentage 18823 1726855050.37302: checking to see if all hosts have failed and the running result is not ok 18823 1726855050.37302: done checking to see if all hosts have failed 18823 1726855050.37303: getting the remaining hosts for this loop 18823 1726855050.37304: done getting the remaining hosts for this loop 18823 1726855050.37306: getting the next task for host managed_node2 18823 1726855050.37310: done getting next task for host managed_node2 18823 1726855050.37313: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 18823 1726855050.37314: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855050.37316: getting variables 18823 1726855050.37316: in VariableManager get_vars() 18823 1726855050.37324: Calling all_inventory to load vars for managed_node2 18823 1726855050.37325: Calling groups_inventory to load vars for managed_node2 18823 1726855050.37327: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.37331: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.37332: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.37334: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.38056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.39429: done with get_vars() 18823 1726855050.39444: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 13:57:30 -0400 (0:00:00.943) 0:00:42.046 ****** 18823 1726855050.39502: entering _queue_task() for managed_node2/include_tasks 18823 1726855050.39761: worker is 1 (out of 1 available) 18823 1726855050.39775: exiting _queue_task() for managed_node2/include_tasks 18823 1726855050.39789: done queuing things up, now waiting for results queue to drain 18823 1726855050.39790: waiting for pending results... 
18823 1726855050.39972: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' 18823 1726855050.40046: in run() - task 0affcc66-ac2b-d391-077c-000000000074 18823 1726855050.40057: variable 'ansible_search_path' from source: unknown 18823 1726855050.40086: calling self._execute() 18823 1726855050.40158: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.40162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.40172: variable 'omit' from source: magic vars 18823 1726855050.40453: variable 'ansible_distribution_major_version' from source: facts 18823 1726855050.40470: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855050.40475: _execute() done 18823 1726855050.40480: dumping result to json 18823 1726855050.40483: done dumping result, returning 18823 1726855050.40491: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' [0affcc66-ac2b-d391-077c-000000000074] 18823 1726855050.40495: sending task result for task 0affcc66-ac2b-d391-077c-000000000074 18823 1726855050.40583: done sending task result for task 0affcc66-ac2b-d391-077c-000000000074 18823 1726855050.40585: WORKER PROCESS EXITING 18823 1726855050.40615: no more pending results, returning what we have 18823 1726855050.40619: in VariableManager get_vars() 18823 1726855050.40651: Calling all_inventory to load vars for managed_node2 18823 1726855050.40653: Calling groups_inventory to load vars for managed_node2 18823 1726855050.40656: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.40669: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.40672: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.40675: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.41472: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.42540: done with get_vars() 18823 1726855050.42554: variable 'ansible_search_path' from source: unknown 18823 1726855050.42565: we have included files to process 18823 1726855050.42566: generating all_blocks data 18823 1726855050.42567: done generating all_blocks data 18823 1726855050.42567: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18823 1726855050.42568: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18823 1726855050.42569: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18823 1726855050.42680: in VariableManager get_vars() 18823 1726855050.42693: done with get_vars() 18823 1726855050.42767: done processing included file 18823 1726855050.42769: iterating over new_blocks loaded from include file 18823 1726855050.42770: in VariableManager get_vars() 18823 1726855050.42779: done with get_vars() 18823 1726855050.42780: filtering new block on tags 18823 1726855050.42792: done filtering new block on tags 18823 1726855050.42794: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 18823 1726855050.42797: extending task lists for all hosts with included blocks 18823 1726855050.42827: done extending task lists 18823 1726855050.42827: done processing included files 18823 1726855050.42828: results queue empty 18823 1726855050.42828: checking for any_errors_fatal 18823 1726855050.42829: done checking for any_errors_fatal 18823 1726855050.42830: checking for max_fail_percentage 18823 1726855050.42830: done 
checking for max_fail_percentage 18823 1726855050.42831: checking to see if all hosts have failed and the running result is not ok 18823 1726855050.42831: done checking to see if all hosts have failed 18823 1726855050.42832: getting the remaining hosts for this loop 18823 1726855050.42832: done getting the remaining hosts for this loop 18823 1726855050.42834: getting the next task for host managed_node2 18823 1726855050.42837: done getting next task for host managed_node2 18823 1726855050.42838: ^ task is: TASK: Include the task 'get_profile_stat.yml' 18823 1726855050.42840: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855050.42841: getting variables 18823 1726855050.42842: in VariableManager get_vars() 18823 1726855050.42847: Calling all_inventory to load vars for managed_node2 18823 1726855050.42849: Calling groups_inventory to load vars for managed_node2 18823 1726855050.42850: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.42854: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.42855: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.42857: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.43529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.45302: done with get_vars() 18823 1726855050.45327: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:57:30 -0400 (0:00:00.058) 0:00:42.105 ****** 18823 1726855050.45405: entering _queue_task() for managed_node2/include_tasks 18823 1726855050.45744: worker is 1 (out of 1 available) 18823 1726855050.45757: exiting _queue_task() for managed_node2/include_tasks 18823 1726855050.45769: done queuing things up, now waiting for results queue to drain 18823 1726855050.45770: waiting for pending results... 
18823 1726855050.46167: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 18823 1726855050.46191: in run() - task 0affcc66-ac2b-d391-077c-0000000004f5 18823 1726855050.46206: variable 'ansible_search_path' from source: unknown 18823 1726855050.46209: variable 'ansible_search_path' from source: unknown 18823 1726855050.46234: calling self._execute() 18823 1726855050.46310: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.46314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.46323: variable 'omit' from source: magic vars 18823 1726855050.46612: variable 'ansible_distribution_major_version' from source: facts 18823 1726855050.46619: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855050.46626: _execute() done 18823 1726855050.46629: dumping result to json 18823 1726855050.46631: done dumping result, returning 18823 1726855050.46639: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-d391-077c-0000000004f5] 18823 1726855050.46644: sending task result for task 0affcc66-ac2b-d391-077c-0000000004f5 18823 1726855050.46734: done sending task result for task 0affcc66-ac2b-d391-077c-0000000004f5 18823 1726855050.46737: WORKER PROCESS EXITING 18823 1726855050.46770: no more pending results, returning what we have 18823 1726855050.46775: in VariableManager get_vars() 18823 1726855050.46813: Calling all_inventory to load vars for managed_node2 18823 1726855050.46816: Calling groups_inventory to load vars for managed_node2 18823 1726855050.46819: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.46833: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.46836: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.46838: Calling groups_plugins_play to load vars for managed_node2 18823 
1726855050.47762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.49251: done with get_vars() 18823 1726855050.49271: variable 'ansible_search_path' from source: unknown 18823 1726855050.49273: variable 'ansible_search_path' from source: unknown 18823 1726855050.49312: we have included files to process 18823 1726855050.49314: generating all_blocks data 18823 1726855050.49316: done generating all_blocks data 18823 1726855050.49317: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18823 1726855050.49318: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18823 1726855050.49320: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18823 1726855050.50342: done processing included file 18823 1726855050.50344: iterating over new_blocks loaded from include file 18823 1726855050.50346: in VariableManager get_vars() 18823 1726855050.50359: done with get_vars() 18823 1726855050.50360: filtering new block on tags 18823 1726855050.50383: done filtering new block on tags 18823 1726855050.50385: in VariableManager get_vars() 18823 1726855050.50399: done with get_vars() 18823 1726855050.50400: filtering new block on tags 18823 1726855050.50420: done filtering new block on tags 18823 1726855050.50422: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 18823 1726855050.50427: extending task lists for all hosts with included blocks 18823 1726855050.50534: done extending task lists 18823 1726855050.50536: done processing included files 18823 1726855050.50536: results queue empty 18823 
1726855050.50537: checking for any_errors_fatal 18823 1726855050.50540: done checking for any_errors_fatal 18823 1726855050.50541: checking for max_fail_percentage 18823 1726855050.50542: done checking for max_fail_percentage 18823 1726855050.50542: checking to see if all hosts have failed and the running result is not ok 18823 1726855050.50543: done checking to see if all hosts have failed 18823 1726855050.50544: getting the remaining hosts for this loop 18823 1726855050.50545: done getting the remaining hosts for this loop 18823 1726855050.50547: getting the next task for host managed_node2 18823 1726855050.50551: done getting next task for host managed_node2 18823 1726855050.50553: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 18823 1726855050.50556: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855050.50558: getting variables 18823 1726855050.50559: in VariableManager get_vars() 18823 1726855050.50622: Calling all_inventory to load vars for managed_node2 18823 1726855050.50625: Calling groups_inventory to load vars for managed_node2 18823 1726855050.50628: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.50633: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.50635: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.50638: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.51703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.53248: done with get_vars() 18823 1726855050.53270: done getting variables 18823 1726855050.53312: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:57:30 -0400 (0:00:00.079) 0:00:42.185 ****** 18823 1726855050.53342: entering _queue_task() for managed_node2/set_fact 18823 1726855050.53675: worker is 1 (out of 1 available) 18823 1726855050.53690: exiting _queue_task() for managed_node2/set_fact 18823 1726855050.53704: done queuing things up, now waiting for results queue to drain 18823 1726855050.53705: waiting for pending results... 
18823 1726855050.54053: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 18823 1726855050.54117: in run() - task 0affcc66-ac2b-d391-077c-000000000502 18823 1726855050.54136: variable 'ansible_search_path' from source: unknown 18823 1726855050.54258: variable 'ansible_search_path' from source: unknown 18823 1726855050.54262: calling self._execute() 18823 1726855050.54281: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.54295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.54314: variable 'omit' from source: magic vars 18823 1726855050.54701: variable 'ansible_distribution_major_version' from source: facts 18823 1726855050.54724: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855050.54742: variable 'omit' from source: magic vars 18823 1726855050.54794: variable 'omit' from source: magic vars 18823 1726855050.54867: variable 'omit' from source: magic vars 18823 1726855050.54923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855050.54962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855050.54990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855050.55095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855050.55098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855050.55100: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855050.55102: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.55106: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 18823 1726855050.55181: Set connection var ansible_timeout to 10 18823 1726855050.55250: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855050.55258: Set connection var ansible_shell_type to sh 18823 1726855050.55269: Set connection var ansible_shell_executable to /bin/sh 18823 1726855050.55279: Set connection var ansible_connection to ssh 18823 1726855050.55291: Set connection var ansible_pipelining to False 18823 1726855050.55326: variable 'ansible_shell_executable' from source: unknown 18823 1726855050.55334: variable 'ansible_connection' from source: unknown 18823 1726855050.55361: variable 'ansible_module_compression' from source: unknown 18823 1726855050.55459: variable 'ansible_shell_type' from source: unknown 18823 1726855050.55463: variable 'ansible_shell_executable' from source: unknown 18823 1726855050.55466: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.55468: variable 'ansible_pipelining' from source: unknown 18823 1726855050.55470: variable 'ansible_timeout' from source: unknown 18823 1726855050.55473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.55562: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855050.55582: variable 'omit' from source: magic vars 18823 1726855050.55595: starting attempt loop 18823 1726855050.55603: running the handler 18823 1726855050.55680: handler run complete 18823 1726855050.55683: attempt loop complete, returning result 18823 1726855050.55686: _execute() done 18823 1726855050.55689: dumping result to json 18823 1726855050.55692: done dumping result, returning 18823 1726855050.55694: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-d391-077c-000000000502] 18823 1726855050.55696: sending task result for task 0affcc66-ac2b-d391-077c-000000000502 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 18823 1726855050.55827: no more pending results, returning what we have 18823 1726855050.55830: results queue empty 18823 1726855050.55831: checking for any_errors_fatal 18823 1726855050.55833: done checking for any_errors_fatal 18823 1726855050.55833: checking for max_fail_percentage 18823 1726855050.55835: done checking for max_fail_percentage 18823 1726855050.55835: checking to see if all hosts have failed and the running result is not ok 18823 1726855050.55836: done checking to see if all hosts have failed 18823 1726855050.55837: getting the remaining hosts for this loop 18823 1726855050.55838: done getting the remaining hosts for this loop 18823 1726855050.55842: getting the next task for host managed_node2 18823 1726855050.55900: done getting next task for host managed_node2 18823 1726855050.55903: ^ task is: TASK: Stat profile file 18823 1726855050.55908: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855050.55913: getting variables 18823 1726855050.55915: in VariableManager get_vars() 18823 1726855050.55942: Calling all_inventory to load vars for managed_node2 18823 1726855050.55944: Calling groups_inventory to load vars for managed_node2 18823 1726855050.55947: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.55962: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.55965: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.55968: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.56555: done sending task result for task 0affcc66-ac2b-d391-077c-000000000502 18823 1726855050.56558: WORKER PROCESS EXITING 18823 1726855050.58570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855050.60758: done with get_vars() 18823 1726855050.60798: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:57:30 -0400 (0:00:00.075) 0:00:42.260 ****** 18823 1726855050.60907: entering _queue_task() for managed_node2/stat 18823 1726855050.61273: worker is 1 (out of 1 available) 18823 1726855050.61285: exiting _queue_task() for managed_node2/stat 18823 1726855050.61298: done queuing things up, now waiting for results queue to drain 18823 1726855050.61300: waiting for pending results... 
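At this point the play is about to execute the "Stat profile file" task from get_profile_stat.yml:9. The task file itself is not reproduced in this log, but from the module_args recorded further down (get_attributes/get_checksum/get_mime all false, path /etc/sysconfig/network-scripts/ifcfg-lsr27, and the variables 'profile'/'interface' resolved from include params and set_fact) it plausibly looks like the following sketch; the templated path and the register target name are inferences, not confirmed by the log:

```yaml
# Hypothetical reconstruction of the "Stat profile file" task.
# The option values match the module_args recorded in this log;
# the templated path and the register name are assumptions.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```

The log later shows this module returning {"stat": {"exists": false}} for ifcfg-lsr27, which is the value the subsequent conditional task evaluates.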
18823 1726855050.61586: running TaskExecutor() for managed_node2/TASK: Stat profile file 18823 1726855050.61798: in run() - task 0affcc66-ac2b-d391-077c-000000000503 18823 1726855050.61802: variable 'ansible_search_path' from source: unknown 18823 1726855050.61809: variable 'ansible_search_path' from source: unknown 18823 1726855050.61812: calling self._execute() 18823 1726855050.61892: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.61910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.61925: variable 'omit' from source: magic vars 18823 1726855050.62313: variable 'ansible_distribution_major_version' from source: facts 18823 1726855050.62333: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855050.62346: variable 'omit' from source: magic vars 18823 1726855050.62396: variable 'omit' from source: magic vars 18823 1726855050.62558: variable 'profile' from source: include params 18823 1726855050.62561: variable 'interface' from source: set_fact 18823 1726855050.62594: variable 'interface' from source: set_fact 18823 1726855050.62622: variable 'omit' from source: magic vars 18823 1726855050.62672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855050.62716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855050.62742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855050.62763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855050.62783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855050.62823: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 
1726855050.62883: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.62886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.62950: Set connection var ansible_timeout to 10 18823 1726855050.62961: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855050.62967: Set connection var ansible_shell_type to sh 18823 1726855050.62977: Set connection var ansible_shell_executable to /bin/sh 18823 1726855050.62988: Set connection var ansible_connection to ssh 18823 1726855050.63002: Set connection var ansible_pipelining to False 18823 1726855050.63093: variable 'ansible_shell_executable' from source: unknown 18823 1726855050.63096: variable 'ansible_connection' from source: unknown 18823 1726855050.63100: variable 'ansible_module_compression' from source: unknown 18823 1726855050.63103: variable 'ansible_shell_type' from source: unknown 18823 1726855050.63107: variable 'ansible_shell_executable' from source: unknown 18823 1726855050.63109: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855050.63111: variable 'ansible_pipelining' from source: unknown 18823 1726855050.63113: variable 'ansible_timeout' from source: unknown 18823 1726855050.63115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855050.63343: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855050.63347: variable 'omit' from source: magic vars 18823 1726855050.63349: starting attempt loop 18823 1726855050.63351: running the handler 18823 1726855050.63377: _low_level_execute_command(): starting 18823 1726855050.63393: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855050.64215: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855050.64247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855050.64302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.64384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.66162: stdout chunk (state=3): >>>/root <<< 18823 1726855050.66298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.66418: stderr chunk (state=3): >>><<< 18823 1726855050.66421: stdout chunk (state=3): >>><<< 18823 1726855050.66424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855050.66427: _low_level_execute_command(): starting 18823 1726855050.66430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749 `" && echo ansible-tmp-1726855050.663559-20801-154462948822749="` echo /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749 `" ) && sleep 0' 18823 1726855050.67055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855050.67077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855050.67202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855050.67229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855050.67245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.67349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.69417: stdout chunk (state=3): >>>ansible-tmp-1726855050.663559-20801-154462948822749=/root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749 <<< 18823 1726855050.69481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.69484: stdout chunk (state=3): >>><<< 18823 1726855050.69697: stderr chunk (state=3): >>><<< 18823 1726855050.69701: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855050.663559-20801-154462948822749=/root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855050.69706: variable 'ansible_module_compression' from source: unknown 18823 1726855050.69836: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18823 1726855050.69878: variable 'ansible_facts' from source: unknown 18823 1726855050.70054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py 18823 1726855050.70530: Sending initial data 18823 1726855050.70540: Sent initial data (152 bytes) 18823 1726855050.71021: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855050.71037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855050.71109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855050.71160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855050.71178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855050.71197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.71303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.72893: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855050.72956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855050.73041: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmps4jmkowd /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py <<< 18823 1726855050.73073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py" <<< 18823 1726855050.73162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmps4jmkowd" to remote "/root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py" <<< 18823 1726855050.74311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.74383: stderr chunk (state=3): >>><<< 18823 1726855050.74422: stdout chunk (state=3): >>><<< 18823 1726855050.74672: done transferring module to remote 18823 1726855050.74675: _low_level_execute_command(): starting 18823 1726855050.74677: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/ /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py && sleep 0' 18823 1726855050.75654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855050.75830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855050.75833: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855050.75836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855050.75838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855050.75891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.75973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.77755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.77792: stderr chunk (state=3): >>><<< 18823 1726855050.77992: stdout chunk (state=3): >>><<< 18823 1726855050.77997: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855050.78000: _low_level_execute_command(): starting 18823 1726855050.78002: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/AnsiballZ_stat.py && sleep 0' 18823 1726855050.79019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855050.79047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855050.79064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855050.79083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855050.79115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855050.79213: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855050.79302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855050.79333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855050.79352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.79471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.94498: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18823 1726855050.95800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855050.95825: stderr chunk (state=3): >>><<< 18823 1726855050.95828: stdout chunk (state=3): >>><<< 18823 1726855050.95843: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855050.95870: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855050.95877: _low_level_execute_command(): starting 18823 1726855050.95882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855050.663559-20801-154462948822749/ > /dev/null 2>&1 && sleep 0' 18823 1726855050.96326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855050.96329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855050.96331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855050.96380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855050.96383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855050.96463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855050.98292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855050.98318: stderr chunk (state=3): >>><<< 18823 1726855050.98323: stdout chunk (state=3): >>><<< 18823 1726855050.98338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855050.98343: handler run complete 18823 1726855050.98359: attempt loop complete, returning result 18823 1726855050.98361: _execute() done 18823 1726855050.98364: dumping result to json 18823 1726855050.98368: done dumping result, returning 18823 1726855050.98377: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcc66-ac2b-d391-077c-000000000503] 18823 1726855050.98382: sending task result for task 0affcc66-ac2b-d391-077c-000000000503 18823 1726855050.98477: done sending task result for task 0affcc66-ac2b-d391-077c-000000000503 18823 1726855050.98481: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 18823 1726855050.98536: no more pending results, returning what we have 18823 1726855050.98540: results queue empty 18823 1726855050.98541: checking for any_errors_fatal 18823 1726855050.98547: done checking for any_errors_fatal 18823 1726855050.98547: checking for max_fail_percentage 18823 1726855050.98549: done checking for max_fail_percentage 18823 1726855050.98550: checking to see if all hosts have failed and the running result is not ok 18823 1726855050.98550: done checking to see if all hosts have failed 18823 1726855050.98551: getting the remaining hosts for this loop 18823 1726855050.98552: done getting the remaining hosts for this loop 18823 1726855050.98556: getting the next task for host managed_node2 18823 1726855050.98564: done getting next task for host managed_node2 18823 1726855050.98566: ^ task is: TASK: 
Set NM profile exist flag based on the profile files 18823 1726855050.98569: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855050.98572: getting variables 18823 1726855050.98574: in VariableManager get_vars() 18823 1726855050.98607: Calling all_inventory to load vars for managed_node2 18823 1726855050.98609: Calling groups_inventory to load vars for managed_node2 18823 1726855050.98613: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855050.98625: Calling all_plugins_play to load vars for managed_node2 18823 1726855050.98627: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855050.98630: Calling groups_plugins_play to load vars for managed_node2 18823 1726855050.99453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855051.03614: done with get_vars() 18823 1726855051.03632: done getting variables 18823 1726855051.03669: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:57:31 -0400 (0:00:00.427) 0:00:42.688 ****** 18823 1726855051.03689: entering _queue_task() for managed_node2/set_fact 18823 1726855051.03949: worker is 1 (out of 1 available) 18823 1726855051.03961: exiting _queue_task() for managed_node2/set_fact 18823 1726855051.03974: done queuing things up, now waiting for results queue to drain 18823 1726855051.03975: waiting for pending results... 18823 1726855051.04152: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 18823 1726855051.04253: in run() - task 0affcc66-ac2b-d391-077c-000000000504 18823 1726855051.04263: variable 'ansible_search_path' from source: unknown 18823 1726855051.04267: variable 'ansible_search_path' from source: unknown 18823 1726855051.04295: calling self._execute() 18823 1726855051.04367: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855051.04372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855051.04382: variable 'omit' from source: magic vars 18823 1726855051.04671: variable 'ansible_distribution_major_version' from source: facts 18823 1726855051.04680: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855051.04766: variable 'profile_stat' from source: set_fact 18823 1726855051.04779: Evaluated conditional (profile_stat.stat.exists): False 18823 1726855051.04782: when evaluation is False, skipping this task 18823 1726855051.04785: _execute() done 18823 1726855051.04789: dumping result to json 18823 1726855051.04792: done dumping 
result, returning 18823 1726855051.04798: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-d391-077c-000000000504] 18823 1726855051.04803: sending task result for task 0affcc66-ac2b-d391-077c-000000000504 18823 1726855051.04896: done sending task result for task 0affcc66-ac2b-d391-077c-000000000504 18823 1726855051.04899: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18823 1726855051.04942: no more pending results, returning what we have 18823 1726855051.04945: results queue empty 18823 1726855051.04946: checking for any_errors_fatal 18823 1726855051.04957: done checking for any_errors_fatal 18823 1726855051.04957: checking for max_fail_percentage 18823 1726855051.04959: done checking for max_fail_percentage 18823 1726855051.04960: checking to see if all hosts have failed and the running result is not ok 18823 1726855051.04960: done checking to see if all hosts have failed 18823 1726855051.04961: getting the remaining hosts for this loop 18823 1726855051.04962: done getting the remaining hosts for this loop 18823 1726855051.04966: getting the next task for host managed_node2 18823 1726855051.04973: done getting next task for host managed_node2 18823 1726855051.04976: ^ task is: TASK: Get NM profile info 18823 1726855051.04979: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855051.04984: getting variables 18823 1726855051.04985: in VariableManager get_vars() 18823 1726855051.05014: Calling all_inventory to load vars for managed_node2 18823 1726855051.05017: Calling groups_inventory to load vars for managed_node2 18823 1726855051.05020: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855051.05033: Calling all_plugins_play to load vars for managed_node2 18823 1726855051.05036: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855051.05038: Calling groups_plugins_play to load vars for managed_node2 18823 1726855051.05821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855051.06707: done with get_vars() 18823 1726855051.06728: done getting variables 18823 1726855051.06797: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:57:31 -0400 (0:00:00.031) 0:00:42.719 ****** 18823 1726855051.06819: entering _queue_task() for managed_node2/shell 18823 1726855051.06821: Creating lock for shell 18823 1726855051.07067: worker is 1 (out of 1 available) 18823 1726855051.07081: exiting _queue_task() for managed_node2/shell 18823 
1726855051.07096: done queuing things up, now waiting for results queue to drain 18823 1726855051.07097: waiting for pending results... 18823 1726855051.07273: running TaskExecutor() for managed_node2/TASK: Get NM profile info 18823 1726855051.07365: in run() - task 0affcc66-ac2b-d391-077c-000000000505 18823 1726855051.07376: variable 'ansible_search_path' from source: unknown 18823 1726855051.07379: variable 'ansible_search_path' from source: unknown 18823 1726855051.07412: calling self._execute() 18823 1726855051.07483: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855051.07489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855051.07500: variable 'omit' from source: magic vars 18823 1726855051.07789: variable 'ansible_distribution_major_version' from source: facts 18823 1726855051.07803: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855051.07809: variable 'omit' from source: magic vars 18823 1726855051.07846: variable 'omit' from source: magic vars 18823 1726855051.07920: variable 'profile' from source: include params 18823 1726855051.07924: variable 'interface' from source: set_fact 18823 1726855051.07975: variable 'interface' from source: set_fact 18823 1726855051.07991: variable 'omit' from source: magic vars 18823 1726855051.08027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855051.08053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855051.08070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855051.08083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855051.08097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 18823 1726855051.08127: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855051.08131: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855051.08134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855051.08203: Set connection var ansible_timeout to 10 18823 1726855051.08213: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855051.08215: Set connection var ansible_shell_type to sh 18823 1726855051.08220: Set connection var ansible_shell_executable to /bin/sh 18823 1726855051.08225: Set connection var ansible_connection to ssh 18823 1726855051.08230: Set connection var ansible_pipelining to False 18823 1726855051.08250: variable 'ansible_shell_executable' from source: unknown 18823 1726855051.08253: variable 'ansible_connection' from source: unknown 18823 1726855051.08257: variable 'ansible_module_compression' from source: unknown 18823 1726855051.08259: variable 'ansible_shell_type' from source: unknown 18823 1726855051.08262: variable 'ansible_shell_executable' from source: unknown 18823 1726855051.08265: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855051.08267: variable 'ansible_pipelining' from source: unknown 18823 1726855051.08270: variable 'ansible_timeout' from source: unknown 18823 1726855051.08272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855051.08375: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855051.08383: variable 'omit' from source: magic vars 18823 1726855051.08389: starting attempt loop 18823 1726855051.08392: running the handler 18823 
1726855051.08401: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855051.08424: _low_level_execute_command(): starting 18823 1726855051.08433: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855051.08946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855051.08951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.08953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855051.08956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855051.08959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.09014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855051.09017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855051.09020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 
1726855051.09110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855051.10788: stdout chunk (state=3): >>>/root <<< 18823 1726855051.10885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855051.10916: stderr chunk (state=3): >>><<< 18823 1726855051.10920: stdout chunk (state=3): >>><<< 18823 1726855051.10940: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855051.10951: _low_level_execute_command(): starting 18823 1726855051.10955: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125 `" && echo ansible-tmp-1726855051.1093884-20833-67861472628125="` echo 
/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125 `" ) && sleep 0' 18823 1726855051.11390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855051.11394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.11397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855051.11399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855051.11402: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.11446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855051.11455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855051.11525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855051.13414: stdout chunk (state=3): >>>ansible-tmp-1726855051.1093884-20833-67861472628125=/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125 <<< 18823 1726855051.13523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855051.13551: stderr chunk (state=3): >>><<< 18823 
1726855051.13554: stdout chunk (state=3): >>><<< 18823 1726855051.13568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855051.1093884-20833-67861472628125=/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855051.13596: variable 'ansible_module_compression' from source: unknown 18823 1726855051.13643: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855051.13675: variable 'ansible_facts' from source: unknown 18823 1726855051.13731: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py 18823 1726855051.13833: Sending initial data 18823 1726855051.13837: Sent initial data (155 bytes) 18823 
1726855051.14281: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855051.14286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855051.14299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855051.14302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855051.14305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.14343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855051.14346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855051.14424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855051.15957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18823 1726855051.15964: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855051.16029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18823 1726855051.16098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp79tj31vq /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py <<< 18823 1726855051.16101: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py" <<< 18823 1726855051.16163: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp79tj31vq" to remote "/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py" <<< 18823 1726855051.16171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py" <<< 18823 1726855051.16799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855051.16837: stderr chunk (state=3): >>><<< 18823 1726855051.16841: stdout chunk (state=3): >>><<< 18823 1726855051.16879: done transferring module to remote 18823 1726855051.16894: _low_level_execute_command(): starting 18823 1726855051.16897: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/ /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py && sleep 0' 
18823 1726855051.17326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855051.17335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855051.17338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.17340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 18823 1726855051.17343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.17375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855051.17379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855051.17459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855051.19186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855051.19214: stderr chunk (state=3): >>><<< 18823 1726855051.19217: stdout chunk (state=3): >>><<< 18823 1726855051.19229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855051.19232: _low_level_execute_command(): starting 18823 1726855051.19235: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/AnsiballZ_command.py && sleep 0' 18823 1726855051.19632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855051.19635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.19647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855051.19693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855051.19713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855051.19780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855051.36512: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 13:57:31.346688", "end": "2024-09-20 13:57:31.362757", "delta": "0:00:00.016069", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855051.38296: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855051.38301: stdout chunk (state=3): >>><<< 18823 1726855051.38303: stderr chunk (state=3): >>><<< 18823 1726855051.38307: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 13:57:31.346688", "end": "2024-09-20 13:57:31.362757", "delta": "0:00:00.016069", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.178 
closed. 18823 1726855051.38310: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855051.38316: _low_level_execute_command(): starting 18823 1726855051.38319: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855051.1093884-20833-67861472628125/ > /dev/null 2>&1 && sleep 0' 18823 1726855051.39346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855051.39350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855051.39352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855051.39354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
18823 1726855051.39356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18823 1726855051.39358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found <<<
18823 1726855051.39360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855051.39411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'
debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855051.39437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855051.39543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855051.41693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855051.41696: stdout chunk (state=3): >>><<<
18823 1726855051.41698: stderr chunk (state=3): >>><<<
18823 1726855051.41793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18823 1726855051.41797: handler run complete
18823 1726855051.41800: Evaluated conditional (False): False
18823 1726855051.41802: attempt loop complete, returning result
18823 1726855051.41806: _execute() done
18823 1726855051.41809: dumping result to json
18823 1726855051.41811: done dumping result, returning
18823 1726855051.41813: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcc66-ac2b-d391-077c-000000000505]
18823 1726855051.41815: sending task result for task 0affcc66-ac2b-d391-077c-000000000505
18823 1726855051.42007: done sending task result for task 0affcc66-ac2b-d391-077c-000000000505
18823 1726855051.42010: WORKER PROCESS EXITING
fatal: [managed_node2]: FAILED! => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc",
    "delta": "0:00:00.016069",
    "end": "2024-09-20 13:57:31.362757",
    "rc": 1,
    "start": "2024-09-20 13:57:31.346688"
}

MSG:

non-zero return code
...ignoring
18823 1726855051.42086: no more pending results, returning what we have
18823 1726855051.42091: results queue empty
18823 1726855051.42092: checking for any_errors_fatal
18823 1726855051.42098: done checking for any_errors_fatal
18823 1726855051.42099: checking for max_fail_percentage
18823 1726855051.42100: done checking for max_fail_percentage
18823 1726855051.42101: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.42102: done checking to see if all hosts have failed
18823 1726855051.42103: getting the remaining hosts for this loop
18823 1726855051.42106: done getting the remaining hosts for this loop
18823 1726855051.42110: getting the next task for host managed_node2
18823 1726855051.42119: done getting next task for host managed_node2
18823 1726855051.42122: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
18823 1726855051.42126: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855051.42130: getting variables
18823 1726855051.42131: in VariableManager get_vars()
18823 1726855051.42162: Calling all_inventory to load vars for managed_node2
18823 1726855051.42165: Calling groups_inventory to load vars for managed_node2
18823 1726855051.42168: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855051.42181: Calling all_plugins_play to load vars for managed_node2
18823 1726855051.42184: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855051.42590: Calling groups_plugins_play to load vars for managed_node2
18823 1726855051.46111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855051.49467: done with get_vars()
18823 1726855051.49498: done getting variables
18823 1726855051.49562: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 13:57:31 -0400 (0:00:00.427) 0:00:43.147 ******
18823 1726855051.49596: entering _queue_task() for managed_node2/set_fact
18823 1726855051.49945: worker is 1 (out of 1 available)
18823 1726855051.49956: exiting _queue_task() for managed_node2/set_fact
18823 1726855051.49967: done queuing things up, now waiting for results queue to drain
18823 1726855051.49968: waiting for pending results...
18823 1726855051.50260: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
18823 1726855051.50408: in run() - task 0affcc66-ac2b-d391-077c-000000000506
18823 1726855051.50593: variable 'ansible_search_path' from source: unknown
18823 1726855051.50597: variable 'ansible_search_path' from source: unknown
18823 1726855051.50601: calling self._execute()
18823 1726855051.50606: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.50609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.50612: variable 'omit' from source: magic vars
18823 1726855051.51016: variable 'ansible_distribution_major_version' from source: facts
18823 1726855051.51036: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855051.51192: variable 'nm_profile_exists' from source: set_fact
18823 1726855051.51217: Evaluated conditional (nm_profile_exists.rc == 0): False
18823 1726855051.51224: when evaluation is False, skipping this task
18823 1726855051.51231: _execute() done
18823 1726855051.51239: dumping result to json
18823 1726855051.51245: done dumping result, returning
18823 1726855051.51256: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-d391-077c-000000000506]
18823 1726855051.51270: sending task result for task 0affcc66-ac2b-d391-077c-000000000506
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
18823 1726855051.51425: no more pending results, returning what we have
18823 1726855051.51430: results queue empty
18823 1726855051.51431: checking for any_errors_fatal
18823 1726855051.51440: done checking for any_errors_fatal
18823 1726855051.51441: checking for max_fail_percentage
18823 1726855051.51442: done checking for max_fail_percentage
18823 1726855051.51443: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.51444: done checking to see if all hosts have failed
18823 1726855051.51445: getting the remaining hosts for this loop
18823 1726855051.51447: done getting the remaining hosts for this loop
18823 1726855051.51450: getting the next task for host managed_node2
18823 1726855051.51462: done getting next task for host managed_node2
18823 1726855051.51465: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
18823 1726855051.51469: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855051.51475: getting variables
18823 1726855051.51476: in VariableManager get_vars()
18823 1726855051.51514: Calling all_inventory to load vars for managed_node2
18823 1726855051.51517: Calling groups_inventory to load vars for managed_node2
18823 1726855051.51521: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855051.51536: Calling all_plugins_play to load vars for managed_node2
18823 1726855051.51539: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855051.51542: Calling groups_plugins_play to load vars for managed_node2
18823 1726855051.52322: done sending task result for task 0affcc66-ac2b-d391-077c-000000000506
18823 1726855051.52326: WORKER PROCESS EXITING
18823 1726855051.53198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855051.54707: done with get_vars()
18823 1726855051.54732: done getting variables
18823 1726855051.54793: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18823 1726855051.54926: variable 'profile' from source: include params
18823 1726855051.54931: variable 'interface' from source: set_fact
18823 1726855051.55008: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-lsr27] **************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 13:57:31 -0400 (0:00:00.054) 0:00:43.202 ******
18823 1726855051.55040: entering _queue_task() for managed_node2/command
18823 1726855051.55398: worker is 1 (out of 1 available)
18823 1726855051.55414: exiting _queue_task() for managed_node2/command
18823 1726855051.55426: done queuing things up, now waiting for results queue to drain
18823 1726855051.55427: waiting for pending results...
18823 1726855051.55710: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr27
18823 1726855051.55853: in run() - task 0affcc66-ac2b-d391-077c-000000000508
18823 1726855051.55877: variable 'ansible_search_path' from source: unknown
18823 1726855051.56031: variable 'ansible_search_path' from source: unknown
18823 1726855051.56074: calling self._execute()
18823 1726855051.56179: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.56194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.56213: variable 'omit' from source: magic vars
18823 1726855051.56623: variable 'ansible_distribution_major_version' from source: facts
18823 1726855051.56640: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855051.56770: variable 'profile_stat' from source: set_fact
18823 1726855051.56799: Evaluated conditional (profile_stat.stat.exists): False
18823 1726855051.56809: when evaluation is False, skipping this task
18823 1726855051.56817: _execute() done
18823 1726855051.56825: dumping result to json
18823 1726855051.56831: done dumping result, returning
18823 1726855051.56841: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0affcc66-ac2b-d391-077c-000000000508]
18823 1726855051.56848: sending task result for task 0affcc66-ac2b-d391-077c-000000000508
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
18823 1726855051.57014: no more pending results, returning what we have
18823 1726855051.57018: results queue empty
18823 1726855051.57019: checking for any_errors_fatal
18823 1726855051.57027: done checking for any_errors_fatal
18823 1726855051.57028: checking for max_fail_percentage
18823 1726855051.57030: done checking for max_fail_percentage
18823 1726855051.57031: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.57032: done checking to see if all hosts have failed
18823 1726855051.57033: getting the remaining hosts for this loop
18823 1726855051.57034: done getting the remaining hosts for this loop
18823 1726855051.57038: getting the next task for host managed_node2
18823 1726855051.57048: done getting next task for host managed_node2
18823 1726855051.57050: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
18823 1726855051.57055: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855051.57060: getting variables
18823 1726855051.57062: in VariableManager get_vars()
18823 1726855051.57098: Calling all_inventory to load vars for managed_node2
18823 1726855051.57101: Calling groups_inventory to load vars for managed_node2
18823 1726855051.57108: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855051.57124: Calling all_plugins_play to load vars for managed_node2
18823 1726855051.57128: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855051.57132: Calling groups_plugins_play to load vars for managed_node2
18823 1726855051.58005: done sending task result for task 0affcc66-ac2b-d391-077c-000000000508
18823 1726855051.58009: WORKER PROCESS EXITING
18823 1726855051.58973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855051.61470: done with get_vars()
18823 1726855051.61497: done getting variables
18823 1726855051.61557: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18823 1726855051.61673: variable 'profile' from source: include params
18823 1726855051.61677: variable 'interface' from source: set_fact
18823 1726855051.61740: variable 'interface' from source: set_fact

TASK [Verify the ansible_managed comment in ifcfg-lsr27] ***********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 13:57:31 -0400 (0:00:00.067) 0:00:43.269 ******
18823 1726855051.61772: entering _queue_task() for managed_node2/set_fact
18823 1726855051.62110: worker is 1 (out of 1 available)
18823 1726855051.62120: exiting _queue_task() for managed_node2/set_fact
18823 1726855051.62131: done queuing things up, now waiting for results queue to drain
18823 1726855051.62131: waiting for pending results...
18823 1726855051.62458: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27
18823 1726855051.62670: in run() - task 0affcc66-ac2b-d391-077c-000000000509
18823 1726855051.62693: variable 'ansible_search_path' from source: unknown
18823 1726855051.62701: variable 'ansible_search_path' from source: unknown
18823 1726855051.62763: calling self._execute()
18823 1726855051.62879: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.62891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.62908: variable 'omit' from source: magic vars
18823 1726855051.63298: variable 'ansible_distribution_major_version' from source: facts
18823 1726855051.63317: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855051.63450: variable 'profile_stat' from source: set_fact
18823 1726855051.63468: Evaluated conditional (profile_stat.stat.exists): False
18823 1726855051.63475: when evaluation is False, skipping this task
18823 1726855051.63490: _execute() done
18823 1726855051.63501: dumping result to json
18823 1726855051.63595: done dumping result, returning
18823 1726855051.63600: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0affcc66-ac2b-d391-077c-000000000509]
18823 1726855051.63602: sending task result for task 0affcc66-ac2b-d391-077c-000000000509
18823 1726855051.63671: done sending task result for task 0affcc66-ac2b-d391-077c-000000000509
18823 1726855051.63674: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
18823 1726855051.63732: no more pending results, returning what we have
18823 1726855051.63736: results queue empty
18823 1726855051.63737: checking for any_errors_fatal
18823 1726855051.63745: done checking for any_errors_fatal
18823 1726855051.63746: checking for max_fail_percentage
18823 1726855051.63748: done checking for max_fail_percentage
18823 1726855051.63749: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.63749: done checking to see if all hosts have failed
18823 1726855051.63750: getting the remaining hosts for this loop
18823 1726855051.63752: done getting the remaining hosts for this loop
18823 1726855051.63756: getting the next task for host managed_node2
18823 1726855051.63765: done getting next task for host managed_node2
18823 1726855051.63768: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
18823 1726855051.63772: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855051.63777: getting variables
18823 1726855051.63779: in VariableManager get_vars()
18823 1726855051.63899: Calling all_inventory to load vars for managed_node2
18823 1726855051.63903: Calling groups_inventory to load vars for managed_node2
18823 1726855051.63909: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855051.63930: Calling all_plugins_play to load vars for managed_node2
18823 1726855051.63934: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855051.63937: Calling groups_plugins_play to load vars for managed_node2
18823 1726855051.66113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855051.67961: done with get_vars()
18823 1726855051.67991: done getting variables
18823 1726855051.68060: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18823 1726855051.68181: variable 'profile' from source: include params
18823 1726855051.68185: variable 'interface' from source: set_fact
18823 1726855051.68253: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-lsr27] ******************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 13:57:31 -0400 (0:00:00.065) 0:00:43.334 ******
18823 1726855051.68285: entering _queue_task() for managed_node2/command
18823 1726855051.68654: worker is 1 (out of 1 available)
18823 1726855051.68780: exiting _queue_task() for managed_node2/command
18823 1726855051.68793: done queuing things up, now waiting for results queue to drain
18823 1726855051.68794: waiting for pending results...
18823 1726855051.69116: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr27
18823 1726855051.69131: in run() - task 0affcc66-ac2b-d391-077c-00000000050a
18823 1726855051.69154: variable 'ansible_search_path' from source: unknown
18823 1726855051.69162: variable 'ansible_search_path' from source: unknown
18823 1726855051.69211: calling self._execute()
18823 1726855051.69326: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.69347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.69430: variable 'omit' from source: magic vars
18823 1726855051.69759: variable 'ansible_distribution_major_version' from source: facts
18823 1726855051.69782: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855051.69915: variable 'profile_stat' from source: set_fact
18823 1726855051.69932: Evaluated conditional (profile_stat.stat.exists): False
18823 1726855051.69937: when evaluation is False, skipping this task
18823 1726855051.69944: _execute() done
18823 1726855051.69950: dumping result to json
18823 1726855051.69956: done dumping result, returning
18823 1726855051.69964: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr27 [0affcc66-ac2b-d391-077c-00000000050a]
18823 1726855051.69978: sending task result for task 0affcc66-ac2b-d391-077c-00000000050a
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
18823 1726855051.70238: no more pending results, returning what we have
18823 1726855051.70242: results queue empty
18823 1726855051.70243: checking for any_errors_fatal
18823 1726855051.70249: done checking for any_errors_fatal
18823 1726855051.70250: checking for max_fail_percentage
18823 1726855051.70252: done checking for max_fail_percentage
18823 1726855051.70252: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.70253: done checking to see if all hosts have failed
18823 1726855051.70254: getting the remaining hosts for this loop
18823 1726855051.70255: done getting the remaining hosts for this loop
18823 1726855051.70259: getting the next task for host managed_node2
18823 1726855051.70267: done getting next task for host managed_node2
18823 1726855051.70270: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
18823 1726855051.70274: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855051.70278: getting variables
18823 1726855051.70280: in VariableManager get_vars()
18823 1726855051.70312: Calling all_inventory to load vars for managed_node2
18823 1726855051.70315: Calling groups_inventory to load vars for managed_node2
18823 1726855051.70319: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855051.70333: Calling all_plugins_play to load vars for managed_node2
18823 1726855051.70336: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855051.70338: Calling groups_plugins_play to load vars for managed_node2
18823 1726855051.71133: done sending task result for task 0affcc66-ac2b-d391-077c-00000000050a
18823 1726855051.71136: WORKER PROCESS EXITING
18823 1726855051.72068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855051.73715: done with get_vars()
18823 1726855051.73735: done getting variables
18823 1726855051.73793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18823 1726855051.73900: variable 'profile' from source: include params
18823 1726855051.73906: variable 'interface' from source: set_fact
18823 1726855051.73961: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-lsr27] ***************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 13:57:31 -0400 (0:00:00.057) 0:00:43.391 ******
18823 1726855051.73997: entering _queue_task() for managed_node2/set_fact
18823 1726855051.74509: worker is 1 (out of 1 available)
18823 1726855051.74519: exiting _queue_task() for managed_node2/set_fact
18823 1726855051.74529: done queuing things up, now waiting for results queue to drain
18823 1726855051.74530: waiting for pending results...
18823 1726855051.74638: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr27
18823 1726855051.74786: in run() - task 0affcc66-ac2b-d391-077c-00000000050b
18823 1726855051.74810: variable 'ansible_search_path' from source: unknown
18823 1726855051.74818: variable 'ansible_search_path' from source: unknown
18823 1726855051.74864: calling self._execute()
18823 1726855051.74977: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.74985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.75007: variable 'omit' from source: magic vars
18823 1726855051.75385: variable 'ansible_distribution_major_version' from source: facts
18823 1726855051.75416: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855051.75544: variable 'profile_stat' from source: set_fact
18823 1726855051.75562: Evaluated conditional (profile_stat.stat.exists): False
18823 1726855051.75569: when evaluation is False, skipping this task
18823 1726855051.75575: _execute() done
18823 1726855051.75583: dumping result to json
18823 1726855051.75592: done dumping result, returning
18823 1726855051.75602: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0affcc66-ac2b-d391-077c-00000000050b]
18823 1726855051.75631: sending task result for task 0affcc66-ac2b-d391-077c-00000000050b
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
18823 1726855051.75782: no more pending results, returning what we have
18823 1726855051.75786: results queue empty
18823 1726855051.75788: checking for any_errors_fatal
18823 1726855051.75798: done checking for any_errors_fatal
18823 1726855051.75799: checking for max_fail_percentage
18823 1726855051.75801: done checking for max_fail_percentage
18823 1726855051.75802: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.75803: done checking to see if all hosts have failed
18823 1726855051.75806: getting the remaining hosts for this loop
18823 1726855051.75808: done getting the remaining hosts for this loop
18823 1726855051.75812: getting the next task for host managed_node2
18823 1726855051.75822: done getting next task for host managed_node2
18823 1726855051.75825: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
18823 1726855051.75829: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855051.75834: getting variables
18823 1726855051.75835: in VariableManager get_vars()
18823 1726855051.75866: Calling all_inventory to load vars for managed_node2
18823 1726855051.75869: Calling groups_inventory to load vars for managed_node2
18823 1726855051.75873: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855051.75993: Calling all_plugins_play to load vars for managed_node2
18823 1726855051.75998: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855051.76002: Calling groups_plugins_play to load vars for managed_node2
18823 1726855051.76613: done sending task result for task 0affcc66-ac2b-d391-077c-00000000050b
18823 1726855051.76616: WORKER PROCESS EXITING
18823 1726855051.77573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855051.79420: done with get_vars()
18823 1726855051.79444: done getting variables
18823 1726855051.79518: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18823 1726855051.79641: variable 'profile' from source: include params
18823 1726855051.79645: variable 'interface' from source: set_fact
18823 1726855051.79713: variable 'interface' from source: set_fact

TASK [Assert that the profile is absent - 'lsr27'] *****************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Friday 20 September 2024 13:57:31 -0400 (0:00:00.057) 0:00:43.449 ******
18823 1726855051.79744: entering _queue_task() for managed_node2/assert
18823 1726855051.80297: worker is 1 (out of 1 available)
18823 1726855051.80311: exiting _queue_task() for managed_node2/assert
18823 1726855051.80323: done queuing things up, now waiting for results queue to drain
18823 1726855051.80324: waiting for pending results...
18823 1726855051.80450: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'lsr27'
18823 1726855051.80582: in run() - task 0affcc66-ac2b-d391-077c-0000000004f6
18823 1726855051.80610: variable 'ansible_search_path' from source: unknown
18823 1726855051.80680: variable 'ansible_search_path' from source: unknown
18823 1726855051.80684: calling self._execute()
18823 1726855051.80768: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.80783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.80802: variable 'omit' from source: magic vars
18823 1726855051.81161: variable 'ansible_distribution_major_version' from source: facts
18823 1726855051.81178: Evaluated conditional (ansible_distribution_major_version != '6'): True
18823 1726855051.81190: variable 'omit' from source: magic vars
18823 1726855051.81237: variable 'omit' from source: magic vars
18823 1726855051.81340: variable 'profile' from source: include params
18823 1726855051.81348: variable 'interface' from source: set_fact
18823 1726855051.81414: variable 'interface' from source: set_fact
18823 1726855051.81548: variable 'omit' from source: magic vars
18823 1726855051.81552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18823 1726855051.81554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18823 1726855051.81560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18823 1726855051.81582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18823 1726855051.81600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18823 1726855051.81639: variable 'inventory_hostname' from source: host vars for 'managed_node2'
18823 1726855051.81654: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.81666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.81780: Set connection var ansible_timeout to 10
18823 1726855051.81876: Set connection var ansible_module_compression to ZIP_DEFLATED
18823 1726855051.81879: Set connection var ansible_shell_type to sh
18823 1726855051.81881: Set connection var ansible_shell_executable to /bin/sh
18823 1726855051.81883: Set connection var ansible_connection to ssh
18823 1726855051.81886: Set connection var ansible_pipelining to False
18823 1726855051.81889: variable 'ansible_shell_executable' from source: unknown
18823 1726855051.81892: variable 'ansible_connection' from source: unknown
18823 1726855051.81894: variable 'ansible_module_compression' from source: unknown
18823 1726855051.81896: variable 'ansible_shell_type' from source: unknown
18823 1726855051.81898: variable 'ansible_shell_executable' from source: unknown
18823 1726855051.81900: variable 'ansible_host' from source: host vars for 'managed_node2'
18823 1726855051.81902: variable 'ansible_pipelining' from source: unknown
18823 1726855051.81907: variable 'ansible_timeout' from source: unknown
18823 1726855051.81910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
18823 1726855051.82055: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18823 1726855051.82072: variable 'omit' from source: magic vars
18823 1726855051.82084: starting attempt loop
18823 1726855051.82102: running the handler
18823 1726855051.82235: variable 'lsr_net_profile_exists' from source: set_fact
18823 1726855051.82246: Evaluated conditional (not lsr_net_profile_exists): True
18823 1726855051.82256: handler run complete
18823 1726855051.82275: attempt loop complete, returning result
18823 1726855051.82314: _execute() done
18823 1726855051.82318: dumping result to json
18823 1726855051.82321: done dumping result, returning
18823 1726855051.82323: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'lsr27' [0affcc66-ac2b-d391-077c-0000000004f6]
18823 1726855051.82325: sending task result for task 0affcc66-ac2b-d391-077c-0000000004f6
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
18823 1726855051.82554: no more pending results, returning what we have
18823 1726855051.82558: results queue empty
18823 1726855051.82560: checking for any_errors_fatal
18823 1726855051.82565: done checking for any_errors_fatal
18823 1726855051.82566: checking for max_fail_percentage
18823 1726855051.82568: done checking for max_fail_percentage
18823 1726855051.82569: checking to see if all hosts have failed and the running result is not ok
18823 1726855051.82570: done checking to see if all hosts have failed
18823 1726855051.82571: getting the remaining hosts for this loop
18823 1726855051.82572: done getting the remaining hosts for this loop
18823 1726855051.82576: getting the next task for host managed_node2
18823 1726855051.82585: done getting next task for host managed_node2
18823 1726855051.82591: ^ task is: TASK: Include the task 'assert_device_absent.yml'
18823 1726855051.82593: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 18823 1726855051.82791: getting variables 18823 1726855051.82794: in VariableManager get_vars() 18823 1726855051.82823: Calling all_inventory to load vars for managed_node2 18823 1726855051.82826: Calling groups_inventory to load vars for managed_node2 18823 1726855051.82829: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855051.82839: Calling all_plugins_play to load vars for managed_node2 18823 1726855051.82842: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855051.82845: Calling groups_plugins_play to load vars for managed_node2 18823 1726855051.83402: done sending task result for task 0affcc66-ac2b-d391-077c-0000000004f6 18823 1726855051.83408: WORKER PROCESS EXITING 18823 1726855051.84323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855051.86622: done with get_vars() 18823 1726855051.86651: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 13:57:31 -0400 (0:00:00.070) 0:00:43.519 ****** 18823 1726855051.86782: entering _queue_task() for managed_node2/include_tasks 18823 1726855051.87155: worker is 1 (out of 1 available) 18823 1726855051.87168: exiting _queue_task() for managed_node2/include_tasks 18823 1726855051.87180: done queuing things up, now waiting for results queue to drain 18823 1726855051.87181: waiting for pending results... 
18823 1726855051.87425: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 18823 1726855051.87541: in run() - task 0affcc66-ac2b-d391-077c-000000000075 18823 1726855051.87560: variable 'ansible_search_path' from source: unknown 18823 1726855051.87605: calling self._execute() 18823 1726855051.87705: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855051.87717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855051.87735: variable 'omit' from source: magic vars 18823 1726855051.88130: variable 'ansible_distribution_major_version' from source: facts 18823 1726855051.88146: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855051.88159: _execute() done 18823 1726855051.88166: dumping result to json 18823 1726855051.88172: done dumping result, returning 18823 1726855051.88181: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0affcc66-ac2b-d391-077c-000000000075] 18823 1726855051.88191: sending task result for task 0affcc66-ac2b-d391-077c-000000000075 18823 1726855051.88493: done sending task result for task 0affcc66-ac2b-d391-077c-000000000075 18823 1726855051.88497: WORKER PROCESS EXITING 18823 1726855051.88528: no more pending results, returning what we have 18823 1726855051.88533: in VariableManager get_vars() 18823 1726855051.88572: Calling all_inventory to load vars for managed_node2 18823 1726855051.88575: Calling groups_inventory to load vars for managed_node2 18823 1726855051.88579: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855051.88596: Calling all_plugins_play to load vars for managed_node2 18823 1726855051.88600: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855051.88604: Calling groups_plugins_play to load vars for managed_node2 18823 1726855051.90968: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855051.94036: done with get_vars() 18823 1726855051.94063: variable 'ansible_search_path' from source: unknown 18823 1726855051.94086: we have included files to process 18823 1726855051.94090: generating all_blocks data 18823 1726855051.94092: done generating all_blocks data 18823 1726855051.94098: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18823 1726855051.94100: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18823 1726855051.94103: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18823 1726855051.94274: in VariableManager get_vars() 18823 1726855051.94293: done with get_vars() 18823 1726855051.94400: done processing included file 18823 1726855051.94402: iterating over new_blocks loaded from include file 18823 1726855051.94403: in VariableManager get_vars() 18823 1726855051.94414: done with get_vars() 18823 1726855051.94415: filtering new block on tags 18823 1726855051.94432: done filtering new block on tags 18823 1726855051.94434: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 18823 1726855051.94440: extending task lists for all hosts with included blocks 18823 1726855051.94586: done extending task lists 18823 1726855051.94590: done processing included files 18823 1726855051.94590: results queue empty 18823 1726855051.94591: checking for any_errors_fatal 18823 1726855051.94595: done checking for any_errors_fatal 18823 1726855051.94596: checking for max_fail_percentage 18823 1726855051.94597: done 
checking for max_fail_percentage 18823 1726855051.94598: checking to see if all hosts have failed and the running result is not ok 18823 1726855051.94599: done checking to see if all hosts have failed 18823 1726855051.94600: getting the remaining hosts for this loop 18823 1726855051.94601: done getting the remaining hosts for this loop 18823 1726855051.94604: getting the next task for host managed_node2 18823 1726855051.94608: done getting next task for host managed_node2 18823 1726855051.94610: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18823 1726855051.94612: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855051.94615: getting variables 18823 1726855051.94615: in VariableManager get_vars() 18823 1726855051.94624: Calling all_inventory to load vars for managed_node2 18823 1726855051.94626: Calling groups_inventory to load vars for managed_node2 18823 1726855051.94628: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855051.94634: Calling all_plugins_play to load vars for managed_node2 18823 1726855051.94636: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855051.94639: Calling groups_plugins_play to load vars for managed_node2 18823 1726855051.96021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855051.97911: done with get_vars() 18823 1726855051.97942: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:57:31 -0400 (0:00:00.112) 0:00:43.631 ****** 18823 1726855051.98020: entering _queue_task() for managed_node2/include_tasks 18823 1726855051.98602: worker is 1 (out of 1 available) 18823 1726855051.98610: exiting _queue_task() for managed_node2/include_tasks 18823 1726855051.98620: done queuing things up, now waiting for results queue to drain 18823 1726855051.98621: waiting for pending results... 
18823 1726855051.98749: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 18823 1726855051.98864: in run() - task 0affcc66-ac2b-d391-077c-00000000053c 18823 1726855051.98958: variable 'ansible_search_path' from source: unknown 18823 1726855051.98961: variable 'ansible_search_path' from source: unknown 18823 1726855051.98965: calling self._execute() 18823 1726855051.99071: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855051.99082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855051.99107: variable 'omit' from source: magic vars 18823 1726855051.99489: variable 'ansible_distribution_major_version' from source: facts 18823 1726855051.99514: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855051.99593: _execute() done 18823 1726855051.99597: dumping result to json 18823 1726855051.99600: done dumping result, returning 18823 1726855051.99606: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-d391-077c-00000000053c] 18823 1726855051.99609: sending task result for task 0affcc66-ac2b-d391-077c-00000000053c 18823 1726855051.99683: done sending task result for task 0affcc66-ac2b-d391-077c-00000000053c 18823 1726855051.99686: WORKER PROCESS EXITING 18823 1726855051.99800: no more pending results, returning what we have 18823 1726855051.99805: in VariableManager get_vars() 18823 1726855051.99847: Calling all_inventory to load vars for managed_node2 18823 1726855051.99851: Calling groups_inventory to load vars for managed_node2 18823 1726855051.99855: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855051.99872: Calling all_plugins_play to load vars for managed_node2 18823 1726855051.99875: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855051.99879: Calling groups_plugins_play to load vars for managed_node2 18823 
1726855052.02222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.04403: done with get_vars() 18823 1726855052.04437: variable 'ansible_search_path' from source: unknown 18823 1726855052.04438: variable 'ansible_search_path' from source: unknown 18823 1726855052.04477: we have included files to process 18823 1726855052.04478: generating all_blocks data 18823 1726855052.04479: done generating all_blocks data 18823 1726855052.04481: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18823 1726855052.04482: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18823 1726855052.04484: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18823 1726855052.04684: done processing included file 18823 1726855052.04686: iterating over new_blocks loaded from include file 18823 1726855052.04690: in VariableManager get_vars() 18823 1726855052.04704: done with get_vars() 18823 1726855052.04706: filtering new block on tags 18823 1726855052.04721: done filtering new block on tags 18823 1726855052.04723: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 18823 1726855052.04728: extending task lists for all hosts with included blocks 18823 1726855052.04844: done extending task lists 18823 1726855052.04845: done processing included files 18823 1726855052.04846: results queue empty 18823 1726855052.04847: checking for any_errors_fatal 18823 1726855052.04850: done checking for any_errors_fatal 18823 1726855052.04851: checking for max_fail_percentage 18823 1726855052.04852: done checking for 
max_fail_percentage 18823 1726855052.04853: checking to see if all hosts have failed and the running result is not ok 18823 1726855052.04853: done checking to see if all hosts have failed 18823 1726855052.04854: getting the remaining hosts for this loop 18823 1726855052.04855: done getting the remaining hosts for this loop 18823 1726855052.04858: getting the next task for host managed_node2 18823 1726855052.04862: done getting next task for host managed_node2 18823 1726855052.04875: ^ task is: TASK: Get stat for interface {{ interface }} 18823 1726855052.04878: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855052.04881: getting variables 18823 1726855052.04919: in VariableManager get_vars() 18823 1726855052.04929: Calling all_inventory to load vars for managed_node2 18823 1726855052.04931: Calling groups_inventory to load vars for managed_node2 18823 1726855052.04939: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.04961: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.04965: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855052.04968: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.06252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.08157: done with get_vars() 18823 1726855052.08181: done getting variables 18823 1726855052.08354: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:57:32 -0400 (0:00:00.103) 0:00:43.735 ****** 18823 1726855052.08386: entering _queue_task() for managed_node2/stat 18823 1726855052.08861: worker is 1 (out of 1 available) 18823 1726855052.08873: exiting _queue_task() for managed_node2/stat 18823 1726855052.08885: done queuing things up, now waiting for results queue to drain 18823 1726855052.08886: waiting for pending results... 
18823 1726855052.09303: running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 18823 1726855052.09310: in run() - task 0affcc66-ac2b-d391-077c-000000000554 18823 1726855052.09313: variable 'ansible_search_path' from source: unknown 18823 1726855052.09316: variable 'ansible_search_path' from source: unknown 18823 1726855052.09357: calling self._execute() 18823 1726855052.09453: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.09464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.09477: variable 'omit' from source: magic vars 18823 1726855052.09845: variable 'ansible_distribution_major_version' from source: facts 18823 1726855052.09862: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855052.09872: variable 'omit' from source: magic vars 18823 1726855052.09932: variable 'omit' from source: magic vars 18823 1726855052.10034: variable 'interface' from source: set_fact 18823 1726855052.10056: variable 'omit' from source: magic vars 18823 1726855052.10192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855052.10195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855052.10198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855052.10200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855052.10201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855052.10228: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855052.10236: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.10243: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.10346: Set connection var ansible_timeout to 10 18823 1726855052.10357: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855052.10363: Set connection var ansible_shell_type to sh 18823 1726855052.10372: Set connection var ansible_shell_executable to /bin/sh 18823 1726855052.10384: Set connection var ansible_connection to ssh 18823 1726855052.10395: Set connection var ansible_pipelining to False 18823 1726855052.10428: variable 'ansible_shell_executable' from source: unknown 18823 1726855052.10436: variable 'ansible_connection' from source: unknown 18823 1726855052.10442: variable 'ansible_module_compression' from source: unknown 18823 1726855052.10447: variable 'ansible_shell_type' from source: unknown 18823 1726855052.10453: variable 'ansible_shell_executable' from source: unknown 18823 1726855052.10459: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.10466: variable 'ansible_pipelining' from source: unknown 18823 1726855052.10472: variable 'ansible_timeout' from source: unknown 18823 1726855052.10479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.10995: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18823 1726855052.11000: variable 'omit' from source: magic vars 18823 1726855052.11002: starting attempt loop 18823 1726855052.11007: running the handler 18823 1726855052.11009: _low_level_execute_command(): starting 18823 1726855052.11011: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855052.12106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855052.12240: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.12268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.12397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.14134: stdout chunk (state=3): >>>/root <<< 18823 1726855052.14240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.14318: stderr chunk (state=3): >>><<< 18823 1726855052.14335: stdout chunk (state=3): >>><<< 18823 1726855052.14367: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.14389: _low_level_execute_command(): starting 18823 1726855052.14485: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326 `" && echo ansible-tmp-1726855052.14374-20878-184698513511326="` echo /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326 `" ) && sleep 0' 18823 1726855052.15035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855052.15044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855052.15055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855052.15070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855052.15093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855052.15096: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855052.15135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.15145: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 18823 1726855052.15148: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855052.15150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18823 1726855052.15215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.15271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.15334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.17269: stdout chunk (state=3): >>>ansible-tmp-1726855052.14374-20878-184698513511326=/root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326 <<< 18823 1726855052.17444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.17448: stdout chunk (state=3): >>><<< 18823 1726855052.17451: stderr chunk (state=3): >>><<< 18823 1726855052.17606: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855052.14374-20878-184698513511326=/root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.17610: variable 'ansible_module_compression' from source: unknown 18823 1726855052.17613: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18823 1726855052.17665: variable 'ansible_facts' from source: unknown 18823 1726855052.17763: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py 18823 1726855052.17950: Sending initial data 18823 1726855052.17959: Sent initial data (151 bytes) 18823 1726855052.18670: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855052.18685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855052.18816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.18842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.18944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.20639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855052.20711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855052.20779: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpevc8394a /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py <<< 18823 1726855052.20782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py" <<< 18823 1726855052.20846: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpevc8394a" to remote "/root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py" <<< 18823 1726855052.21798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.21802: stdout chunk (state=3): >>><<< 18823 1726855052.21804: stderr chunk (state=3): >>><<< 18823 1726855052.21811: done transferring module to remote 18823 1726855052.21826: _low_level_execute_command(): starting 18823 1726855052.21842: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/ /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py && sleep 0' 18823 1726855052.22507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.22575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855052.22615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.22650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.22722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.24660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.24665: stderr chunk (state=3): >>><<< 18823 1726855052.24668: stdout chunk (state=3): >>><<< 18823 1726855052.24670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.24672: _low_level_execute_command(): starting 18823 1726855052.24674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/AnsiballZ_stat.py && sleep 0' 18823 1726855052.25409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.25430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18823 1726855052.25528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.25711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.25808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.41011: stdout 
chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18823 1726855052.42101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.42126: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. <<< 18823 1726855052.42174: stderr chunk (state=3): >>><<< 18823 1726855052.42191: stdout chunk (state=3): >>><<< 18823 1726855052.42223: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855052.42255: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855052.42268: _low_level_execute_command(): starting 18823 1726855052.42277: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855052.14374-20878-184698513511326/ > /dev/null 2>&1 && sleep 0' 18823 1726855052.42904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855052.42920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855052.42935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855052.42960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855052.42999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.43088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.43123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.43229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.45074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.45144: stderr chunk (state=3): >>><<< 18823 1726855052.45147: stdout chunk (state=3): >>><<< 18823 1726855052.45165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.45171: handler run complete 18823 1726855052.45194: attempt loop complete, returning result 18823 1726855052.45197: _execute() done 18823 1726855052.45199: dumping result to json 18823 1726855052.45256: done dumping result, returning 18823 1726855052.45259: done running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 [0affcc66-ac2b-d391-077c-000000000554] 18823 1726855052.45262: sending task result for task 0affcc66-ac2b-d391-077c-000000000554 18823 1726855052.45339: done sending task result for task 0affcc66-ac2b-d391-077c-000000000554 18823 1726855052.45342: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 18823 1726855052.45424: no more pending results, returning what we have 18823 1726855052.45429: results queue empty 18823 1726855052.45430: checking for any_errors_fatal 18823 1726855052.45432: done checking for any_errors_fatal 18823 1726855052.45432: checking for max_fail_percentage 18823 1726855052.45434: done checking for max_fail_percentage 18823 1726855052.45435: checking to see if all hosts have failed and the running result is not ok 18823 1726855052.45436: done checking to see if all hosts have failed 18823 1726855052.45436: getting the remaining hosts for this loop 18823 1726855052.45438: done getting the remaining hosts for this loop 18823 1726855052.45442: getting the next task for host managed_node2 18823 1726855052.45449: done getting next task for host managed_node2 18823 1726855052.45452: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 18823 1726855052.45454: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855052.45459: getting variables 18823 1726855052.45460: in VariableManager get_vars() 18823 1726855052.45631: Calling all_inventory to load vars for managed_node2 18823 1726855052.45634: Calling groups_inventory to load vars for managed_node2 18823 1726855052.45638: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.45649: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.45652: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855052.45656: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.47452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.49518: done with get_vars() 18823 1726855052.49548: done getting variables 18823 1726855052.49605: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18823 1726855052.49715: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:57:32 -0400 (0:00:00.413) 0:00:44.149 ****** 18823 1726855052.49743: entering _queue_task() for managed_node2/assert 18823 1726855052.50074: 
worker is 1 (out of 1 available) 18823 1726855052.50091: exiting _queue_task() for managed_node2/assert 18823 1726855052.50105: done queuing things up, now waiting for results queue to drain 18823 1726855052.50106: waiting for pending results... 18823 1726855052.50475: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'lsr27' 18823 1726855052.50485: in run() - task 0affcc66-ac2b-d391-077c-00000000053d 18823 1726855052.50526: variable 'ansible_search_path' from source: unknown 18823 1726855052.50530: variable 'ansible_search_path' from source: unknown 18823 1726855052.50544: calling self._execute() 18823 1726855052.50660: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.50663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.50666: variable 'omit' from source: magic vars 18823 1726855052.51018: variable 'ansible_distribution_major_version' from source: facts 18823 1726855052.51056: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855052.51059: variable 'omit' from source: magic vars 18823 1726855052.51080: variable 'omit' from source: magic vars 18823 1726855052.51180: variable 'interface' from source: set_fact 18823 1726855052.51199: variable 'omit' from source: magic vars 18823 1726855052.51239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855052.51278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855052.51299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855052.51317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855052.51328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 18823 1726855052.51358: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855052.51361: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.51372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.51493: Set connection var ansible_timeout to 10 18823 1726855052.51501: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855052.51510: Set connection var ansible_shell_type to sh 18823 1726855052.51514: Set connection var ansible_shell_executable to /bin/sh 18823 1726855052.51516: Set connection var ansible_connection to ssh 18823 1726855052.51518: Set connection var ansible_pipelining to False 18823 1726855052.51614: variable 'ansible_shell_executable' from source: unknown 18823 1726855052.51619: variable 'ansible_connection' from source: unknown 18823 1726855052.51622: variable 'ansible_module_compression' from source: unknown 18823 1726855052.51624: variable 'ansible_shell_type' from source: unknown 18823 1726855052.51626: variable 'ansible_shell_executable' from source: unknown 18823 1726855052.51628: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.51630: variable 'ansible_pipelining' from source: unknown 18823 1726855052.51632: variable 'ansible_timeout' from source: unknown 18823 1726855052.51634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.51723: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855052.51726: variable 'omit' from source: magic vars 18823 1726855052.51729: starting attempt loop 18823 1726855052.51732: running the handler 18823 
1726855052.51860: variable 'interface_stat' from source: set_fact 18823 1726855052.51893: Evaluated conditional (not interface_stat.stat.exists): True 18823 1726855052.51896: handler run complete 18823 1726855052.51899: attempt loop complete, returning result 18823 1726855052.51901: _execute() done 18823 1726855052.51903: dumping result to json 18823 1726855052.51909: done dumping result, returning 18823 1726855052.51912: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'lsr27' [0affcc66-ac2b-d391-077c-00000000053d] 18823 1726855052.51943: sending task result for task 0affcc66-ac2b-d391-077c-00000000053d 18823 1726855052.52109: done sending task result for task 0affcc66-ac2b-d391-077c-00000000053d 18823 1726855052.52113: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18823 1726855052.52162: no more pending results, returning what we have 18823 1726855052.52166: results queue empty 18823 1726855052.52167: checking for any_errors_fatal 18823 1726855052.52176: done checking for any_errors_fatal 18823 1726855052.52177: checking for max_fail_percentage 18823 1726855052.52178: done checking for max_fail_percentage 18823 1726855052.52179: checking to see if all hosts have failed and the running result is not ok 18823 1726855052.52180: done checking to see if all hosts have failed 18823 1726855052.52181: getting the remaining hosts for this loop 18823 1726855052.52182: done getting the remaining hosts for this loop 18823 1726855052.52186: getting the next task for host managed_node2 18823 1726855052.52197: done getting next task for host managed_node2 18823 1726855052.52200: ^ task is: TASK: meta (flush_handlers) 18823 1726855052.52202: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18823 1726855052.52205: getting variables 18823 1726855052.52207: in VariableManager get_vars() 18823 1726855052.52235: Calling all_inventory to load vars for managed_node2 18823 1726855052.52238: Calling groups_inventory to load vars for managed_node2 18823 1726855052.52242: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.52253: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.52257: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855052.52260: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.53893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.55394: done with get_vars() 18823 1726855052.55418: done getting variables 18823 1726855052.55491: in VariableManager get_vars() 18823 1726855052.55502: Calling all_inventory to load vars for managed_node2 18823 1726855052.55504: Calling groups_inventory to load vars for managed_node2 18823 1726855052.55507: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.55512: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.55514: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855052.55517: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.56628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.58163: done with get_vars() 18823 1726855052.58192: done queuing things up, now waiting for results queue to drain 18823 1726855052.58194: results queue empty 18823 1726855052.58195: checking for any_errors_fatal 18823 1726855052.58197: done checking for any_errors_fatal 18823 1726855052.58198: checking for max_fail_percentage 18823 1726855052.58200: done checking for max_fail_percentage 18823 1726855052.58200: checking to see if all 
hosts have failed and the running result is not ok 18823 1726855052.58201: done checking to see if all hosts have failed 18823 1726855052.58207: getting the remaining hosts for this loop 18823 1726855052.58208: done getting the remaining hosts for this loop 18823 1726855052.58212: getting the next task for host managed_node2 18823 1726855052.58215: done getting next task for host managed_node2 18823 1726855052.58217: ^ task is: TASK: meta (flush_handlers) 18823 1726855052.58219: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855052.58221: getting variables 18823 1726855052.58223: in VariableManager get_vars() 18823 1726855052.58232: Calling all_inventory to load vars for managed_node2 18823 1726855052.58234: Calling groups_inventory to load vars for managed_node2 18823 1726855052.58236: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.58241: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.58243: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855052.58245: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.59197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.60044: done with get_vars() 18823 1726855052.60058: done getting variables 18823 1726855052.60094: in VariableManager get_vars() 18823 1726855052.60100: Calling all_inventory to load vars for managed_node2 18823 1726855052.60101: Calling groups_inventory to load vars for managed_node2 18823 1726855052.60103: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.60106: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.60108: Calling 
groups_plugins_inventory to load vars for managed_node2 18823 1726855052.60110: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.60842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.62175: done with get_vars() 18823 1726855052.62196: done queuing things up, now waiting for results queue to drain 18823 1726855052.62198: results queue empty 18823 1726855052.62198: checking for any_errors_fatal 18823 1726855052.62199: done checking for any_errors_fatal 18823 1726855052.62199: checking for max_fail_percentage 18823 1726855052.62200: done checking for max_fail_percentage 18823 1726855052.62201: checking to see if all hosts have failed and the running result is not ok 18823 1726855052.62201: done checking to see if all hosts have failed 18823 1726855052.62202: getting the remaining hosts for this loop 18823 1726855052.62202: done getting the remaining hosts for this loop 18823 1726855052.62204: getting the next task for host managed_node2 18823 1726855052.62207: done getting next task for host managed_node2 18823 1726855052.62207: ^ task is: None 18823 1726855052.62208: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855052.62209: done queuing things up, now waiting for results queue to drain 18823 1726855052.62210: results queue empty 18823 1726855052.62210: checking for any_errors_fatal 18823 1726855052.62211: done checking for any_errors_fatal 18823 1726855052.62211: checking for max_fail_percentage 18823 1726855052.62211: done checking for max_fail_percentage 18823 1726855052.62212: checking to see if all hosts have failed and the running result is not ok 18823 1726855052.62212: done checking to see if all hosts have failed 18823 1726855052.62213: getting the next task for host managed_node2 18823 1726855052.62214: done getting next task for host managed_node2 18823 1726855052.62215: ^ task is: None 18823 1726855052.62216: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855052.62248: in VariableManager get_vars() 18823 1726855052.62259: done with get_vars() 18823 1726855052.62263: in VariableManager get_vars() 18823 1726855052.62268: done with get_vars() 18823 1726855052.62271: variable 'omit' from source: magic vars 18823 1726855052.62297: in VariableManager get_vars() 18823 1726855052.62304: done with get_vars() 18823 1726855052.62318: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18823 1726855052.62433: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18823 1726855052.62453: getting the remaining hosts for this loop 18823 1726855052.62454: done getting the remaining hosts for this loop 18823 1726855052.62456: getting the next task for host managed_node2 18823 1726855052.62457: done getting next task for host managed_node2 18823 1726855052.62459: ^ task is: TASK: Gathering Facts 18823 1726855052.62460: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855052.62461: getting variables 18823 1726855052.62462: in VariableManager get_vars() 18823 1726855052.62467: Calling all_inventory to load vars for managed_node2 18823 1726855052.62469: Calling groups_inventory to load vars for managed_node2 18823 1726855052.62470: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855052.62474: Calling all_plugins_play to load vars for managed_node2 18823 1726855052.62475: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855052.62477: Calling groups_plugins_play to load vars for managed_node2 18823 1726855052.63137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855052.64325: done with get_vars() 18823 1726855052.64347: done getting variables 18823 1726855052.64411: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 13:57:32 -0400 (0:00:00.146) 0:00:44.296 ****** 18823 1726855052.64437: entering _queue_task() for managed_node2/gather_facts 18823 1726855052.64793: worker is 1 (out of 1 available) 18823 1726855052.64806: exiting _queue_task() for managed_node2/gather_facts 18823 1726855052.64818: done queuing things up, now waiting for results queue to drain 18823 1726855052.64819: waiting for pending results... 
18823 1726855052.65123: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18823 1726855052.65211: in run() - task 0affcc66-ac2b-d391-077c-00000000056d 18823 1726855052.65241: variable 'ansible_search_path' from source: unknown 18823 1726855052.65282: calling self._execute() 18823 1726855052.65367: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.65592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.65595: variable 'omit' from source: magic vars 18823 1726855052.65798: variable 'ansible_distribution_major_version' from source: facts 18823 1726855052.65822: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855052.65841: variable 'omit' from source: magic vars 18823 1726855052.65872: variable 'omit' from source: magic vars 18823 1726855052.65917: variable 'omit' from source: magic vars 18823 1726855052.65973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855052.66017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855052.66045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855052.66075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855052.66093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855052.66129: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855052.66138: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.66145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.66335: Set connection var ansible_timeout to 10 18823 1726855052.66349: Set connection 
var ansible_module_compression to ZIP_DEFLATED 18823 1726855052.66368: Set connection var ansible_shell_type to sh 18823 1726855052.66391: Set connection var ansible_shell_executable to /bin/sh 18823 1726855052.66406: Set connection var ansible_connection to ssh 18823 1726855052.66430: Set connection var ansible_pipelining to False 18823 1726855052.66468: variable 'ansible_shell_executable' from source: unknown 18823 1726855052.66472: variable 'ansible_connection' from source: unknown 18823 1726855052.66475: variable 'ansible_module_compression' from source: unknown 18823 1726855052.66478: variable 'ansible_shell_type' from source: unknown 18823 1726855052.66480: variable 'ansible_shell_executable' from source: unknown 18823 1726855052.66482: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855052.66511: variable 'ansible_pipelining' from source: unknown 18823 1726855052.66515: variable 'ansible_timeout' from source: unknown 18823 1726855052.66517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855052.66660: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855052.66667: variable 'omit' from source: magic vars 18823 1726855052.66677: starting attempt loop 18823 1726855052.66680: running the handler 18823 1726855052.66694: variable 'ansible_facts' from source: unknown 18823 1726855052.66715: _low_level_execute_command(): starting 18823 1726855052.66724: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855052.67431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855052.67436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.67483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.67570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.69338: stdout chunk (state=3): >>>/root <<< 18823 1726855052.69443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.69507: stderr chunk (state=3): >>><<< 18823 1726855052.69531: stdout chunk (state=3): >>><<< 18823 1726855052.69596: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.69600: _low_level_execute_command(): starting 18823 1726855052.69603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093 `" && echo ansible-tmp-1726855052.6955714-20912-81380738736093="` echo /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093 `" ) && sleep 0' 18823 1726855052.70266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855052.70370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.70406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855052.70426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.70449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.70562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.72569: stdout chunk (state=3): >>>ansible-tmp-1726855052.6955714-20912-81380738736093=/root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093 <<< 18823 1726855052.72665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.72702: stderr chunk (state=3): >>><<< 18823 1726855052.72708: stdout chunk (state=3): >>><<< 18823 1726855052.72759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855052.6955714-20912-81380738736093=/root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.72781: variable 'ansible_module_compression' from source: unknown 18823 1726855052.72870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18823 1726855052.72933: variable 'ansible_facts' from source: unknown 18823 1726855052.73197: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py 18823 1726855052.73316: Sending initial data 18823 1726855052.73425: Sent initial data (153 bytes) 18823 1726855052.73962: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855052.74002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 18823 1726855052.74084: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.74110: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.74132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.74225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.75827: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855052.75893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855052.75960: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpzcjz702z /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py <<< 18823 1726855052.75972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py" <<< 18823 1726855052.76034: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpzcjz702z" to remote "/root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py" <<< 18823 1726855052.77285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.77318: stderr chunk (state=3): >>><<< 18823 1726855052.77426: stdout chunk (state=3): >>><<< 18823 1726855052.77429: done transferring module to remote 18823 1726855052.77431: _low_level_execute_command(): starting 18823 1726855052.77434: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/ /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py && sleep 0' 18823 1726855052.77833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855052.77846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855052.77857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.77902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855052.77923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.77986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855052.79901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855052.79909: stdout chunk (state=3): >>><<< 18823 1726855052.79912: stderr chunk (state=3): >>><<< 18823 1726855052.79928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855052.80010: _low_level_execute_command(): starting 18823 1726855052.80014: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/AnsiballZ_setup.py && sleep 0' 18823 1726855052.80622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855052.80654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855052.80657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855052.80700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855052.80785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
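The remote-side sequence the log has just traced — create a per-task tmpdir under `umask 77`, sftp the `AnsiballZ_setup.py` payload into it, `chmod u+x` the directory and the script, then execute it with the remote Python — can be re-enacted locally. This is a rough sketch with illustrative stand-in paths and a fake payload, not Ansible's actual implementation:

```shell
# Rough local re-enactment of the remote steps visible in the log above.
# All paths and the payload are illustrative stand-ins.
set -e
tmp_root=$(mktemp -d)                         # stand-in for ~/.ansible/tmp
task_dir="$tmp_root/ansible-tmp-example"      # stand-in for ansible-tmp-<ts>-<pid>-<rand>
( umask 77 && mkdir "$task_dir" )             # 0700 dir, like the log's "umask 77 && mkdir"
printf '%s\n' 'print("module ran")' > "$task_dir/AnsiballZ_setup.py"  # fake module payload
chmod u+x "$task_dir" "$task_dir/AnsiballZ_setup.py"                  # mirrors the chmod step
python3 "$task_dir/AnsiballZ_setup.py"        # mirrors the final execute step; prints "module ran"
rm -rf "$tmp_root"                            # cleanup (Ansible removes its tmpdir too)
```

The `umask 77` before `mkdir` is what makes the temporary directory mode 0700, so only the connecting user can read the staged module and its arguments.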
18823 1726855053.43559: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "33", "epoch": "1726855053", "epoch_int": "1726855053", "date": "2024-09-20", "time": "13:57:33", "iso8601_micro": "2024-09-20T17:57:33.081342Z", "iso8601": "2024-09-20T17:57:33Z", "iso8601_basic": "20240920T135733081342", "iso8601_basic_short": "20240920T135733", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.53955078125, "5m": 0.4248046875, "15m": 0.2216796875}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": 
"off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3286, "used": 245}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 836, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794570240, "block_size": 4096, "block_total": 65519099, "block_available": 63914690, "block_used": 1604409, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18823 1726855053.45677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855053.45681: stdout chunk (state=3): >>><<< 18823 1726855053.45684: stderr chunk (state=3): >>><<< 18823 1726855053.45972: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP 
PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "57", "second": "33", "epoch": "1726855053", "epoch_int": "1726855053", "date": "2024-09-20", "time": "13:57:33", "iso8601_micro": "2024-09-20T17:57:33.081342Z", "iso8601": "2024-09-20T17:57:33Z", "iso8601_basic": "20240920T135733081342", "iso8601_basic_short": "20240920T135733", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.53955078125, "5m": 0.4248046875, "15m": 0.2216796875}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], 
"ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3286, "used": 245}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 836, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794570240, "block_size": 4096, "block_total": 65519099, "block_available": 63914690, "block_used": 1604409, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855053.46474: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855053.46535: _low_level_execute_command(): starting 18823 1726855053.46602: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855052.6955714-20912-81380738736093/ > /dev/null 2>&1 && sleep 0' 18823 1726855053.47328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855053.47346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855053.47359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855053.47376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855053.47395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 <<< 18823 1726855053.47407: stderr chunk (state=3): >>>debug2: match not found <<< 18823 1726855053.47452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.47527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855053.47552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855053.47684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855053.47792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855053.49802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855053.49806: stdout chunk (state=3): >>><<< 18823 1726855053.49809: stderr chunk (state=3): >>><<< 18823 1726855053.49811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855053.49814: handler run complete 18823 1726855053.50167: variable 'ansible_facts' from source: unknown 18823 1726855053.50336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.51005: variable 'ansible_facts' from source: unknown 18823 1726855053.51124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.51360: attempt loop complete, returning result 18823 1726855053.51404: _execute() done 18823 1726855053.51416: dumping result to json 18823 1726855053.51467: done dumping result, returning 18823 1726855053.51593: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-d391-077c-00000000056d] 18823 1726855053.51596: sending task result for task 0affcc66-ac2b-d391-077c-00000000056d ok: [managed_node2] 18823 1726855053.52387: no more pending results, returning what we have 18823 1726855053.52392: results queue empty 18823 1726855053.52393: checking for any_errors_fatal 18823 1726855053.52395: done checking for any_errors_fatal 18823 1726855053.52395: checking for max_fail_percentage 18823 1726855053.52397: done checking for max_fail_percentage 18823 1726855053.52398: checking to see if all hosts have failed and the running result is not ok 18823 1726855053.52399: 
done checking to see if all hosts have failed 18823 1726855053.52399: getting the remaining hosts for this loop 18823 1726855053.52401: done getting the remaining hosts for this loop 18823 1726855053.52404: getting the next task for host managed_node2 18823 1726855053.52409: done getting next task for host managed_node2 18823 1726855053.52411: ^ task is: TASK: meta (flush_handlers) 18823 1726855053.52413: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18823 1726855053.52417: getting variables 18823 1726855053.52418: in VariableManager get_vars() 18823 1726855053.52439: Calling all_inventory to load vars for managed_node2 18823 1726855053.52441: Calling groups_inventory to load vars for managed_node2 18823 1726855053.52444: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855053.52451: done sending task result for task 0affcc66-ac2b-d391-077c-00000000056d 18823 1726855053.52453: WORKER PROCESS EXITING 18823 1726855053.52462: Calling all_plugins_play to load vars for managed_node2 18823 1726855053.52466: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855053.52468: Calling groups_plugins_play to load vars for managed_node2 18823 1726855053.60293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.61879: done with get_vars() 18823 1726855053.61908: done getting variables 18823 1726855053.61964: in VariableManager get_vars() 18823 1726855053.61977: Calling all_inventory to load vars for managed_node2 18823 1726855053.62003: Calling groups_inventory to load vars for managed_node2 18823 1726855053.62007: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855053.62012: Calling 
all_plugins_play to load vars for managed_node2 18823 1726855053.62015: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855053.62018: Calling groups_plugins_play to load vars for managed_node2 18823 1726855053.63198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.65135: done with get_vars() 18823 1726855053.65161: done queuing things up, now waiting for results queue to drain 18823 1726855053.65164: results queue empty 18823 1726855053.65169: checking for any_errors_fatal 18823 1726855053.65173: done checking for any_errors_fatal 18823 1726855053.65174: checking for max_fail_percentage 18823 1726855053.65175: done checking for max_fail_percentage 18823 1726855053.65175: checking to see if all hosts have failed and the running result is not ok 18823 1726855053.65176: done checking to see if all hosts have failed 18823 1726855053.65177: getting the remaining hosts for this loop 18823 1726855053.65178: done getting the remaining hosts for this loop 18823 1726855053.65181: getting the next task for host managed_node2 18823 1726855053.65184: done getting next task for host managed_node2 18823 1726855053.65188: ^ task is: TASK: Verify network state restored to default 18823 1726855053.65190: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855053.65192: getting variables 18823 1726855053.65193: in VariableManager get_vars() 18823 1726855053.65201: Calling all_inventory to load vars for managed_node2 18823 1726855053.65203: Calling groups_inventory to load vars for managed_node2 18823 1726855053.65206: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855053.65211: Calling all_plugins_play to load vars for managed_node2 18823 1726855053.65213: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855053.65216: Calling groups_plugins_play to load vars for managed_node2 18823 1726855053.66530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.68425: done with get_vars() 18823 1726855053.68495: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 13:57:33 -0400 (0:00:01.042) 0:00:45.339 ****** 18823 1726855053.68735: entering _queue_task() for managed_node2/include_tasks 18823 1726855053.69558: worker is 1 (out of 1 available) 18823 1726855053.69569: exiting _queue_task() for managed_node2/include_tasks 18823 1726855053.69580: done queuing things up, now waiting for results queue to drain 18823 1726855053.69581: waiting for pending results... 
18823 1726855053.69717: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 18823 1726855053.69949: in run() - task 0affcc66-ac2b-d391-077c-000000000078 18823 1726855053.69953: variable 'ansible_search_path' from source: unknown 18823 1726855053.69960: calling self._execute() 18823 1726855053.70066: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855053.70078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855053.70097: variable 'omit' from source: magic vars 18823 1726855053.70517: variable 'ansible_distribution_major_version' from source: facts 18823 1726855053.70535: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855053.70601: _execute() done 18823 1726855053.70607: dumping result to json 18823 1726855053.70610: done dumping result, returning 18823 1726855053.70613: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0affcc66-ac2b-d391-077c-000000000078] 18823 1726855053.70615: sending task result for task 0affcc66-ac2b-d391-077c-000000000078 18823 1726855053.70821: no more pending results, returning what we have 18823 1726855053.70827: in VariableManager get_vars() 18823 1726855053.70863: Calling all_inventory to load vars for managed_node2 18823 1726855053.70866: Calling groups_inventory to load vars for managed_node2 18823 1726855053.70870: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855053.70885: Calling all_plugins_play to load vars for managed_node2 18823 1726855053.70890: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855053.70894: Calling groups_plugins_play to load vars for managed_node2 18823 1726855053.71530: done sending task result for task 0affcc66-ac2b-d391-077c-000000000078 18823 1726855053.71534: WORKER PROCESS EXITING 18823 1726855053.72372: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.73257: done with get_vars() 18823 1726855053.73273: variable 'ansible_search_path' from source: unknown 18823 1726855053.73285: we have included files to process 18823 1726855053.73286: generating all_blocks data 18823 1726855053.73289: done generating all_blocks data 18823 1726855053.73289: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18823 1726855053.73290: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18823 1726855053.73292: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18823 1726855053.73560: done processing included file 18823 1726855053.73561: iterating over new_blocks loaded from include file 18823 1726855053.73562: in VariableManager get_vars() 18823 1726855053.73570: done with get_vars() 18823 1726855053.73571: filtering new block on tags 18823 1726855053.73581: done filtering new block on tags 18823 1726855053.73583: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 18823 1726855053.73586: extending task lists for all hosts with included blocks 18823 1726855053.73617: done extending task lists 18823 1726855053.73619: done processing included files 18823 1726855053.73620: results queue empty 18823 1726855053.73621: checking for any_errors_fatal 18823 1726855053.73623: done checking for any_errors_fatal 18823 1726855053.73623: checking for max_fail_percentage 18823 1726855053.73624: done checking for max_fail_percentage 18823 1726855053.73625: checking to see if all hosts have failed and the running 
result is not ok 18823 1726855053.73626: done checking to see if all hosts have failed 18823 1726855053.73626: getting the remaining hosts for this loop 18823 1726855053.73628: done getting the remaining hosts for this loop 18823 1726855053.73630: getting the next task for host managed_node2 18823 1726855053.73634: done getting next task for host managed_node2 18823 1726855053.73636: ^ task is: TASK: Check routes and DNS 18823 1726855053.73638: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855053.73640: getting variables 18823 1726855053.73641: in VariableManager get_vars() 18823 1726855053.73647: Calling all_inventory to load vars for managed_node2 18823 1726855053.73649: Calling groups_inventory to load vars for managed_node2 18823 1726855053.73650: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855053.73654: Calling all_plugins_play to load vars for managed_node2 18823 1726855053.73656: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855053.73657: Calling groups_plugins_play to load vars for managed_node2 18823 1726855053.74782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855053.75660: done with get_vars() 18823 1726855053.75675: done getting variables 18823 1726855053.75707: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:57:33 -0400 (0:00:00.069) 0:00:45.408 ****** 18823 1726855053.75729: entering _queue_task() for managed_node2/shell 18823 1726855053.75983: worker is 1 (out of 1 available) 18823 1726855053.75998: exiting _queue_task() for managed_node2/shell 18823 1726855053.76009: done queuing things up, now waiting for results queue to drain 18823 1726855053.76010: waiting for pending results... 
18823 1726855053.76201: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 18823 1726855053.76298: in run() - task 0affcc66-ac2b-d391-077c-00000000057e 18823 1726855053.76324: variable 'ansible_search_path' from source: unknown 18823 1726855053.76327: variable 'ansible_search_path' from source: unknown 18823 1726855053.76363: calling self._execute() 18823 1726855053.76438: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855053.76443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855053.76457: variable 'omit' from source: magic vars 18823 1726855053.76823: variable 'ansible_distribution_major_version' from source: facts 18823 1726855053.76979: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855053.76982: variable 'omit' from source: magic vars 18823 1726855053.76985: variable 'omit' from source: magic vars 18823 1726855053.76990: variable 'omit' from source: magic vars 18823 1726855053.77012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855053.77053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855053.77085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855053.77111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855053.77128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855053.77170: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855053.77188: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855053.77206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855053.77336: 
Set connection var ansible_timeout to 10 18823 1726855053.77363: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855053.77371: Set connection var ansible_shell_type to sh 18823 1726855053.77375: Set connection var ansible_shell_executable to /bin/sh 18823 1726855053.77377: Set connection var ansible_connection to ssh 18823 1726855053.77381: Set connection var ansible_pipelining to False 18823 1726855053.77425: variable 'ansible_shell_executable' from source: unknown 18823 1726855053.77436: variable 'ansible_connection' from source: unknown 18823 1726855053.77449: variable 'ansible_module_compression' from source: unknown 18823 1726855053.77457: variable 'ansible_shell_type' from source: unknown 18823 1726855053.77463: variable 'ansible_shell_executable' from source: unknown 18823 1726855053.77470: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855053.77477: variable 'ansible_pipelining' from source: unknown 18823 1726855053.77483: variable 'ansible_timeout' from source: unknown 18823 1726855053.77492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855053.77668: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855053.77684: variable 'omit' from source: magic vars 18823 1726855053.77759: starting attempt loop 18823 1726855053.77762: running the handler 18823 1726855053.77765: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855053.77767: 
_low_level_execute_command(): starting 18823 1726855053.77769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855053.78399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855053.78420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855053.78434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.78495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855053.78502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855053.78577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855053.80281: stdout chunk (state=3): >>>/root <<< 18823 1726855053.80427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855053.80430: stderr chunk (state=3): >>><<< 18823 1726855053.80437: stdout chunk (state=3): >>><<< 18823 1726855053.80453: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855053.80472: _low_level_execute_command(): starting 18823 1726855053.80476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398 `" && echo ansible-tmp-1726855053.8045332-20960-88902198665398="` echo /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398 `" ) && sleep 0' 18823 1726855053.81169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855053.81227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855053.83176: stdout chunk (state=3): >>>ansible-tmp-1726855053.8045332-20960-88902198665398=/root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398 <<< 18823 1726855053.83307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855053.83352: stderr chunk (state=3): >>><<< 18823 1726855053.83355: stdout chunk (state=3): >>><<< 18823 1726855053.83594: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855053.8045332-20960-88902198665398=/root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855053.83599: variable 'ansible_module_compression' from source: unknown 18823 1726855053.83602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855053.83606: variable 'ansible_facts' from source: unknown 18823 1726855053.83608: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py 18823 1726855053.83738: Sending initial data 18823 1726855053.83801: Sent initial data (155 bytes) 18823 1726855053.84368: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855053.84377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.84383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 18823 1726855053.84409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855053.84413: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.84452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855053.84464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855053.84545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855053.86149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18823 1726855053.86153: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855053.86218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855053.86286: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmpwlaq5a8w /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py <<< 18823 1726855053.86295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py" <<< 18823 1726855053.86357: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmpwlaq5a8w" to remote "/root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py" <<< 18823 1726855053.86359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py" <<< 18823 1726855053.86991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855053.87035: stderr chunk (state=3): >>><<< 18823 1726855053.87038: stdout chunk (state=3): >>><<< 18823 1726855053.87075: done transferring module to remote 18823 1726855053.87084: _low_level_execute_command(): starting 18823 1726855053.87091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/ /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py && sleep 0' 18823 1726855053.87544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855053.87548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855053.87550: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.87552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855053.87554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.87607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855053.87610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855053.87683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855053.89484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855053.89515: stderr chunk (state=3): >>><<< 18823 1726855053.89519: stdout chunk (state=3): >>><<< 18823 1726855053.89531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855053.89534: _low_level_execute_command(): starting 18823 1726855053.89538: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/AnsiballZ_command.py && sleep 0' 18823 1726855053.89961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855053.89964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 18823 1726855053.89966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855053.89968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855053.89970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855053.90022: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855053.90025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855053.90111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.06532: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:d8:4c:fa:7a:71 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.178/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3298sec preferred_lft 3298sec\n inet6 fe80::d8:4cff:fefa:7a71/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.178 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.178 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:57:34.053350", "end": "2024-09-20 13:57:34.062072", "delta": "0:00:00.008722", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf 
]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855054.08156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 18823 1726855054.08160: stdout chunk (state=3): >>><<< 18823 1726855054.08163: stderr chunk (state=3): >>><<< 18823 1726855054.08166: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:d8:4c:fa:7a:71 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.178/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3298sec preferred_lft 3298sec\n inet6 fe80::d8:4cff:fefa:7a71/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.178 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.178 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:57:34.053350", "end": 
"2024-09-20 13:57:34.062072", "delta": "0:00:00.008722", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
18823 1726855054.08179: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855054.08190: _low_level_execute_command(): starting 18823 1726855054.08195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855053.8045332-20960-88902198665398/ > /dev/null 2>&1 && sleep 0' 18823 1726855054.09563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855054.09567: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855054.09570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855054.09572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855054.09744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855054.09749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855054.09916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.12065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855054.12103: stderr chunk (state=3): >>><<< 18823 1726855054.12106: stdout chunk (state=3): >>><<< 18823 1726855054.12127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855054.12134: handler run complete 18823 1726855054.12158: Evaluated conditional (False): False 18823 1726855054.12170: attempt loop complete, returning result 18823 1726855054.12172: _execute() done 18823 1726855054.12175: dumping result to json 18823 1726855054.12179: done dumping result, returning 18823 1726855054.12189: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0affcc66-ac2b-d391-077c-00000000057e] 18823 1726855054.12399: sending task result for task 0affcc66-ac2b-d391-077c-00000000057e 18823 1726855054.12711: done sending task result for task 0affcc66-ac2b-d391-077c-00000000057e 18823 1726855054.12714: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008722", "end": "2024-09-20 13:57:34.062072", "rc": 0, "start": "2024-09-20 13:57:34.053350" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:d8:4c:fa:7a:71 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.178/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3298sec preferred_lft 3298sec inet6 fe80::d8:4cff:fefa:7a71/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 
proto dhcp src 10.31.45.178 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.178 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 18823 1726855054.12786: no more pending results, returning what we have 18823 1726855054.12792: results queue empty 18823 1726855054.12793: checking for any_errors_fatal 18823 1726855054.12795: done checking for any_errors_fatal 18823 1726855054.12796: checking for max_fail_percentage 18823 1726855054.12797: done checking for max_fail_percentage 18823 1726855054.12798: checking to see if all hosts have failed and the running result is not ok 18823 1726855054.12799: done checking to see if all hosts have failed 18823 1726855054.12800: getting the remaining hosts for this loop 18823 1726855054.12801: done getting the remaining hosts for this loop 18823 1726855054.12805: getting the next task for host managed_node2 18823 1726855054.12811: done getting next task for host managed_node2 18823 1726855054.12814: ^ task is: TASK: Verify DNS and network connectivity 18823 1726855054.12817: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18823 1726855054.12821: getting variables 18823 1726855054.12822: in VariableManager get_vars() 18823 1726855054.12850: Calling all_inventory to load vars for managed_node2 18823 1726855054.12852: Calling groups_inventory to load vars for managed_node2 18823 1726855054.12861: Calling all_plugins_inventory to load vars for managed_node2 18823 1726855054.12871: Calling all_plugins_play to load vars for managed_node2 18823 1726855054.12874: Calling groups_plugins_inventory to load vars for managed_node2 18823 1726855054.12876: Calling groups_plugins_play to load vars for managed_node2 18823 1726855054.15929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18823 1726855054.19078: done with get_vars() 18823 1726855054.19113: done getting variables 18823 1726855054.19176: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:57:34 -0400 (0:00:00.434) 0:00:45.843 ****** 18823 1726855054.19211: entering _queue_task() for managed_node2/shell 18823 1726855054.19724: worker is 1 (out of 1 available) 18823 1726855054.19735: exiting _queue_task() for managed_node2/shell 18823 1726855054.19746: done queuing things up, now waiting for results queue to drain 18823 1726855054.19747: waiting for pending results... 
18823 1726855054.19990: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 18823 1726855054.20118: in run() - task 0affcc66-ac2b-d391-077c-00000000057f 18823 1726855054.20193: variable 'ansible_search_path' from source: unknown 18823 1726855054.20196: variable 'ansible_search_path' from source: unknown 18823 1726855054.20200: calling self._execute() 18823 1726855054.20304: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855054.20321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855054.20337: variable 'omit' from source: magic vars 18823 1726855054.20766: variable 'ansible_distribution_major_version' from source: facts 18823 1726855054.20783: Evaluated conditional (ansible_distribution_major_version != '6'): True 18823 1726855054.20954: variable 'ansible_facts' from source: unknown 18823 1726855054.21755: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 18823 1726855054.21768: variable 'omit' from source: magic vars 18823 1726855054.21812: variable 'omit' from source: magic vars 18823 1726855054.21893: variable 'omit' from source: magic vars 18823 1726855054.21905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18823 1726855054.21942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18823 1726855054.21974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18823 1726855054.21999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855054.22017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18823 1726855054.22063: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18823 1726855054.22066: variable 
'ansible_host' from source: host vars for 'managed_node2' 18823 1726855054.22093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855054.22183: Set connection var ansible_timeout to 10 18823 1726855054.22199: Set connection var ansible_module_compression to ZIP_DEFLATED 18823 1726855054.22279: Set connection var ansible_shell_type to sh 18823 1726855054.22282: Set connection var ansible_shell_executable to /bin/sh 18823 1726855054.22285: Set connection var ansible_connection to ssh 18823 1726855054.22288: Set connection var ansible_pipelining to False 18823 1726855054.22293: variable 'ansible_shell_executable' from source: unknown 18823 1726855054.22295: variable 'ansible_connection' from source: unknown 18823 1726855054.22297: variable 'ansible_module_compression' from source: unknown 18823 1726855054.22299: variable 'ansible_shell_type' from source: unknown 18823 1726855054.22300: variable 'ansible_shell_executable' from source: unknown 18823 1726855054.22302: variable 'ansible_host' from source: host vars for 'managed_node2' 18823 1726855054.22304: variable 'ansible_pipelining' from source: unknown 18823 1726855054.22306: variable 'ansible_timeout' from source: unknown 18823 1726855054.22308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18823 1726855054.22453: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855054.22497: variable 'omit' from source: magic vars 18823 1726855054.22500: starting attempt loop 18823 1726855054.22502: running the handler 18823 1726855054.22507: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18823 1726855054.22592: _low_level_execute_command(): starting 18823 1726855054.22595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18823 1726855054.23418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855054.23503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855054.23532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855054.23649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.25421: stdout chunk (state=3): >>>/root <<< 18823 1726855054.25424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855054.25695: stderr chunk (state=3): >>><<< 18823 1726855054.25698: stdout chunk (state=3): >>><<< 18823 1726855054.25702: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855054.25708: _low_level_execute_command(): starting 18823 1726855054.25712: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765 `" && echo ansible-tmp-1726855054.2560985-20978-270999491745765="` echo /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765 `" ) && sleep 0' 18823 1726855054.26430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855054.26444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18823 1726855054.26467: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855054.26517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855054.26530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855054.26616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.28610: stdout chunk (state=3): >>>ansible-tmp-1726855054.2560985-20978-270999491745765=/root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765 <<< 18823 1726855054.28836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855054.28840: stdout chunk (state=3): >>><<< 18823 1726855054.28843: stderr chunk (state=3): >>><<< 18823 1726855054.29294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855054.2560985-20978-270999491745765=/root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855054.29298: variable 'ansible_module_compression' from source: unknown 18823 1726855054.29300: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18823q83f5450/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18823 1726855054.29302: variable 'ansible_facts' from source: unknown 18823 1726855054.29345: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py 18823 1726855054.29611: Sending initial data 18823 1726855054.29621: Sent initial data (156 bytes) 18823 1726855054.30242: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855054.30308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18823 1726855054.30365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855054.30381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855054.30415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855054.30535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.32237: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18823 1726855054.32325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18823 1726855054.32443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18823q83f5450/tmp3d_g9hlm /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py <<< 18823 1726855054.32446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py" <<< 18823 1726855054.32508: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18823q83f5450/tmp3d_g9hlm" to remote "/root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py" <<< 18823 1726855054.33702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855054.33860: stderr chunk (state=3): >>><<< 18823 1726855054.33863: stdout chunk (state=3): >>><<< 18823 1726855054.33882: done transferring module to remote 18823 1726855054.33896: _low_level_execute_command(): starting 18823 1726855054.33902: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/ /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py && sleep 0' 18823 1726855054.34561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855054.34577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855054.34685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855054.34716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855054.34818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.36782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18823 1726855054.36786: stdout chunk (state=3): >>><<< 18823 1726855054.36793: stderr chunk (state=3): >>><<< 18823 1726855054.36842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18823 1726855054.36846: _low_level_execute_command(): starting 18823 1726855054.36849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/AnsiballZ_command.py && sleep 0' 18823 1726855054.37624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855054.37627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18823 1726855054.37630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18823 1726855054.37632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18823 1726855054.37634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 18823 1726855054.37659: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 18823 1726855054.37743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18823 1726855054.37756: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18823 1726855054.38023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18823 1726855054.75141: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3697 0 --:--:-- --:--:-- --:--:-- 3719\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2516 0 --:--:-- --:--:-- --:--:-- 2530", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND 
CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:57:34.529277", "end": "2024-09-20 13:57:34.748996", "delta": "0:00:00.219719", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18823 1726855054.76713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 18823 1726855054.76773: stderr chunk (state=3): >>><<< 18823 1726855054.76934: stdout chunk (state=3): >>><<< 18823 1726855054.76939: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3697 0 --:--:-- --:--:-- --:--:-- 3719\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2516 0 --:--:-- --:--:-- --:--:-- 2530", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:57:34.529277", "end": "2024-09-20 13:57:34.748996", "delta": "0:00:00.219719", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 18823 1726855054.76950: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18823 1726855054.76953: _low_level_execute_command(): starting 18823 1726855054.76955: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855054.2560985-20978-270999491745765/ > /dev/null 2>&1 && sleep 0' 18823 1726855054.77519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18823 1726855054.77608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18823 1726855054.77637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
18823 1726855054.77652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
18823 1726855054.77677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18823 1726855054.77784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18823 1726855054.79894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18823 1726855054.79898: stdout chunk (state=3): >>><<<
18823 1726855054.79900: stderr chunk (state=3): >>><<<
18823 1726855054.79902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18823 1726855054.79907: handler run complete
18823 1726855054.79908: Evaluated conditional (False): False
18823 1726855054.79910: attempt loop complete, returning result
18823 1726855054.79912: _execute() done
18823 1726855054.79913: dumping result to json
18823 1726855054.79915: done dumping result, returning
18823 1726855054.79916: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0affcc66-ac2b-d391-077c-00000000057f]
18823 1726855054.79918: sending task result for task 0affcc66-ac2b-d391-077c-00000000057f
18823 1726855054.79986: done sending task result for task 0affcc66-ac2b-d391-077c-00000000057f
18823 1726855054.79991: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.219719",
    "end": "2024-09-20 13:57:34.748996",
    "rc": 0,
    "start": "2024-09-20 13:57:34.529277"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   3697      0 --:--:-- --:--:-- --:--:--  3719
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   2516      0 --:--:-- --:--:-- --:--:--  2530

18823 1726855054.80063: no more pending results, returning what we have
18823 1726855054.80067: results queue empty
18823 1726855054.80068:
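For readability, here is a reconstruction of the one-line command embedded in the escaped `cmd` field of the task result above. This is an editorial sketch recovered from the log, not the collection's source file; the script path `/tmp/check_connectivity.sh` is chosen for illustration. The comments also explain why the STDOUT block above shows `getent` address/name records and why the STDERR block contains curl progress tables.

```shell
# Reconstructed body of TASK: Verify DNS and network connectivity
# (recovered from the escaped "cmd" field in the JSON result above).
cat > /tmp/check_connectivity.sh <<'EOF'
#!/bin/bash
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # getent consults NSS (DNS, /etc/hosts, ...); each resolved record is
    # printed as "address name [aliases...]" -- that is the STDOUT block above.
    if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
    fi
    # curl without -s writes its progress meter to stderr, which is why the
    # STDERR section above contains the "% Total ... Received ..." tables.
    if ! curl -o /dev/null "https://$host"; then
        echo FAILED to contact host "$host"
        exit 1
    fi
done
EOF

# Syntax-check the reconstructed script without touching the network.
bash -n /tmp/check_connectivity.sh && echo "syntax OK"
```

Because of `set -euo pipefail`, the first failed lookup or request aborts the task with rc=1; here both hosts resolved and responded, so the task returned rc=0 and was reported as `ok`.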
checking for any_errors_fatal
18823 1726855054.80078: done checking for any_errors_fatal
18823 1726855054.80079: checking for max_fail_percentage
18823 1726855054.80080: done checking for max_fail_percentage
18823 1726855054.80081: checking to see if all hosts have failed and the running result is not ok
18823 1726855054.80082: done checking to see if all hosts have failed
18823 1726855054.80083: getting the remaining hosts for this loop
18823 1726855054.80084: done getting the remaining hosts for this loop
18823 1726855054.80090: getting the next task for host managed_node2
18823 1726855054.80098: done getting next task for host managed_node2
18823 1726855054.80106: ^ task is: TASK: meta (flush_handlers)
18823 1726855054.80108: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855054.80112: getting variables
18823 1726855054.80113: in VariableManager get_vars()
18823 1726855054.80144: Calling all_inventory to load vars for managed_node2
18823 1726855054.80147: Calling groups_inventory to load vars for managed_node2
18823 1726855054.80151: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855054.80163: Calling all_plugins_play to load vars for managed_node2
18823 1726855054.80166: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855054.80169: Calling groups_plugins_play to load vars for managed_node2
18823 1726855054.81711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855054.84737: done with get_vars()
18823 1726855054.84766: done getting variables
18823 1726855054.84841: in VariableManager get_vars()
18823 1726855054.84852: Calling all_inventory to load vars for managed_node2
18823 1726855054.84855: Calling groups_inventory to load vars for managed_node2
18823 1726855054.84863: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855054.84868: Calling all_plugins_play to load vars for managed_node2
18823 1726855054.84870: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855054.84873: Calling groups_plugins_play to load vars for managed_node2
18823 1726855054.86403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855054.88095: done with get_vars()
18823 1726855054.88138: done queuing things up, now waiting for results queue to drain
18823 1726855054.88141: results queue empty
18823 1726855054.88142: checking for any_errors_fatal
18823 1726855054.88146: done checking for any_errors_fatal
18823 1726855054.88146: checking for max_fail_percentage
18823 1726855054.88148: done checking for max_fail_percentage
18823 1726855054.88148: checking to see if all hosts have failed and the running result is not ok
18823 1726855054.88149: done checking to see if all hosts have failed
18823 1726855054.88150: getting the remaining hosts for this loop
18823 1726855054.88151: done getting the remaining hosts for this loop
18823 1726855054.88154: getting the next task for host managed_node2
18823 1726855054.88158: done getting next task for host managed_node2
18823 1726855054.88159: ^ task is: TASK: meta (flush_handlers)
18823 1726855054.88161: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855054.88164: getting variables
18823 1726855054.88165: in VariableManager get_vars()
18823 1726855054.88174: Calling all_inventory to load vars for managed_node2
18823 1726855054.88176: Calling groups_inventory to load vars for managed_node2
18823 1726855054.88179: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855054.88184: Calling all_plugins_play to load vars for managed_node2
18823 1726855054.88186: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855054.88192: Calling groups_plugins_play to load vars for managed_node2
18823 1726855054.89786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855054.91410: done with get_vars()
18823 1726855054.91443: done getting variables
18823 1726855054.91501: in VariableManager get_vars()
18823 1726855054.91514: Calling all_inventory to load vars for managed_node2
18823 1726855054.91517: Calling groups_inventory to load vars for managed_node2
18823 1726855054.91519: Calling all_plugins_inventory to load vars for managed_node2
18823 1726855054.91525: Calling all_plugins_play to load vars for managed_node2
18823 1726855054.91527: Calling groups_plugins_inventory to load vars for managed_node2
18823 1726855054.91530: Calling groups_plugins_play to load vars for managed_node2
18823 1726855054.92774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18823 1726855054.94549: done with get_vars()
18823 1726855054.94579: done queuing things up, now waiting for results queue to drain
18823 1726855054.94581: results queue empty
18823 1726855054.94581: checking for any_errors_fatal
18823 1726855054.94583: done checking for any_errors_fatal
18823 1726855054.94583: checking for max_fail_percentage
18823 1726855054.94584: done checking for max_fail_percentage
18823 1726855054.94585: checking to see if all hosts have failed and the running result is not ok
18823 1726855054.94585: done checking to see if all hosts have failed
18823 1726855054.94586: getting the remaining hosts for this loop
18823 1726855054.94589: done getting the remaining hosts for this loop
18823 1726855054.94591: getting the next task for host managed_node2
18823 1726855054.94594: done getting next task for host managed_node2
18823 1726855054.94595: ^ task is: None
18823 1726855054.94596: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18823 1726855054.94597: done queuing things up, now waiting for results queue to drain
18823 1726855054.94598: results queue empty
18823 1726855054.94599: checking for any_errors_fatal
18823 1726855054.94599: done checking for any_errors_fatal
18823 1726855054.94600: checking for max_fail_percentage
18823 1726855054.94601: done checking for max_fail_percentage
18823 1726855054.94601: checking to see if all hosts have failed and the running result is not ok
18823 1726855054.94602: done checking to see if all hosts have failed
18823 1726855054.94603: getting the next task for host managed_node2
18823 1726855054.94607: done getting next task for host managed_node2
18823 1726855054.94608: ^ task is: None
18823 1726855054.94609: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node2              : ok=83   changed=3    unreachable=0    failed=0    skipped=73   rescued=0    ignored=1

Friday 20 September 2024  13:57:34 -0400 (0:00:00.754)       0:00:46.598 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.13s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.78s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.59s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.31s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.26s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.21s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Create veth interface lsr27 --------------------------------------------- 1.15s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 1.08s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Gathering Facts --------------------------------------------------------- 1.00s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
Gathering Facts --------------------------------------------------------- 0.94s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.93s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
Install iproute --------------------------------------------------------- 0.83s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
18823 1726855054.94724: RUNNING CLEANUP